Project On
Ex-MBA: 2015-2017
Date: 23/SEP/2017
1 | Page
DATA PRIVACY, RISK AND DATA PROTECTION
Report
CERTIFICATE
This is to certify that the project work “DATA PRIVACY, RISK AND DATA PROTECTION” is a bona fide
record of work done by Suneel Koul under the guidance of Dr. Pravin Kumar Bhoyar in partial
Acknowledgements
I take this opportunity to express my sincere gratitude to Dr. Pravin Kumar Bhoyar for his sincere
guidance and encouragement in carrying out this project work. I sincerely thank all the colleagues of
Symantec Softwares India Pvt Ltd and fellow Student Managers who rendered their help during the
period of my project work. I would also like to thank all the respondents who gave their valuable time
for filling up the questionnaires and for giving valuable inputs during the exploratory research.
Suneel Koul
PRN: 15020448025
EXECUTIVE SUMMARY
Privacy impact assessments (PIAs) and data protection measures are widely used in European
countries, especially by government departments and agencies, local authorities, National Health
Service (NHS) trusts and even by companies. Surveys of past and present practice have found
that more than two-thirds of respondents conduct privacy impact assessments.
The UK was the first country in Europe to develop and promulgate a privacy impact assessment
methodology. The Information Commissioner’s Office (ICO) published a PIA Handbook in
December 2007, followed by a revision in June 2009.
The Handbook is aimed at organizations that are developing projects with implications for
people’s privacy. It helps organizations identify and assess any privacy concerns (a privacy
impact assessment) and address them at an early stage, rather than bolting on solutions as an
expensive afterthought.
The Cabinet Office accepted the value of PIA reports and stressed that they will be used and
monitored in all departments as a means of protecting personal data from July 2008 onwards.
PIAs have thus become a “mandatory minimum measure” in the UK government and its
agencies.
The European Commission introduced its proposed Data Protection Regulation in January 2012,
Article 33 of which would make PIAs mandatory for both public and private sector
organizations throughout Europe where processing operations are likely to present specific
risks to the rights and freedoms of data subjects.
One of the most important pieces of legislation protecting our data at present is the Information
Technology Act (hereinafter the IT Act). The IT Act makes hacking and tampering with computer
source code an offence and penalizes unlawful access to data. However, it does not prescribe any
minimum security standards with which entities having control of data must comply, except in
the case of sensitive personal information.
Section 43
This section provides protection against unauthorized access to a computer system by imposing
a heavy penalty of up to one crore rupees. Unauthorized downloading, extraction and copying of
data are covered by the same penalty. Clause (c) of this section imposes a penalty for the
unauthorized introduction of computer viruses or contaminants. Clause (g) provides penalties for
assisting unauthorized access.
Section 65
This section provides protection for computer source code. Anyone who knowingly or
intentionally conceals, destroys or alters it, or causes another to do so, is liable to imprisonment
or a fine of up to two lakh rupees. Protection has thus been provided against tampering with
computer source documents.
Section 66
Protection against hacking is provided under this section. Hacking is defined as any act done
with the intention of causing wrongful loss or damage to any person, or with the knowledge that
wrongful loss or damage will be caused, whereby information residing in a computer resource is
destroyed, deleted or altered, or its value and utility diminished. This section imposes on the
hacker a penalty of imprisonment of up to three years, a fine of up to two lakh rupees, or both.
Section 70
This section protects data stored in protected systems. Protected systems are those computers,
computer systems or computer networks that the appropriate government, by notification in the
Official Gazette, has declared to be protected systems. Any access, or attempt to secure access,
to such a system in contravention of this section makes the person liable to imprisonment which
may extend to ten years, and also to a fine.
Section 72
This section provides protection against breach of confidentiality and privacy of data. Under it,
any person who, in exercise of powers conferred under the IT Act and allied rules, has secured
access to any electronic record, book, register, correspondence, information, document or other
material and discloses it to any other person shall be punished with imprisonment which may
extend to two years, a fine which may extend to one lakh rupees, or both.
Contents
EXECUTIVE SUMMARY ............................................................................................................................................ 5
CHAPTER-1 .............................................................................................................................................................. 9
1. INTRODUCTION ............................................................................................................................................. 9
1.1 Privacy Impact Assessment ........................................................................................................................ 15
1.2 PIA Risk Management ................................................................................................................................ 15
1.3 Methodologies .................................................................................................................................................... 16
CHAPTER-2 ............................................................................................................................................................ 18
2. Project and Technology Development Management Standards and Methodologies. ..................................... 18
2.1 Project Management Methodologies .......................................................................................................... 20
2.1.1 Project Management Body of Knowledge (PMBOK©) ............................................................................. 20
CHAPTER-3 ............................................................................................................................................................ 27
3. The Human Brain project ................................................................................................................................ 27
3.1 Data as a valuable resource for research................................................................................................................. 27
3.2 EU regulations on data protection .......................................................................................................................... 28
3.3 Individual rights versus the common good ............................................................................................................. 28
CHAPTER-4 ............................................................................................................................................................ 30
4. PRIVACY RISK MANAGEMENT ............................................................................................................... 30
4.1 The notion of privacy risk ...................................................................................................................................... 30
4.2 Level of risks: how to estimate them? .................................................................................................................... 31
CHAPTER-5 ............................................................................................................................................................ 34
5. Expression of Needs and Identification of Security Objectives (EBIOS) in the Field of privacy ................... 34
5.1 Background study: What is the context? ................................................................................................................ 34
5.2 Feared events study: What does one fear happening? ............................................................................................ 35
5.3 Risk study: What is the risk level? ......................................................................................................................... 39
5.4 Measures study: What can be done to treat risks? .................................................................................................. 41
CHAPTER-6 ............................................................................................................................................................ 48
6. Risk Management in Data Protection Regulation ........................................................................................... 48
6.1 ISO/IEC 27005:2011 Information security risk management ................................................................................ 48
6.2 NIST SP 800-39 Managing Information Security Risk .......................................................................................... 48
6.3 ISACA and COBIT ................................................................................................................................................ 52
6.4 EBIOS (Expression of Needs and Identification of Security) ................................................................................ 59
CHAPTER-7 ............................................................................................................................................................ 61
7. ISO/IEC 29100:2011 Information technology — Security techniques ........................................................... 61
CHAPTER-8 ............................................................................................................................................................ 66
8. Assessing and identifying data protection risks overview ............................................................................... 66
8.1 Completing the privacy risk register ...................................................................................................................... 68
CHAPTER-9 ............................................................................................................................................................ 68
9. Data and Data protection laws......................................................................................................................... 68
9.1 Data Classification ................................................................................................................................................. 76
9.2 Why Data Protection? ............................................................................................................................................ 83
9.3 Why is data protection needed? .............................................................................................................................. 83
9.4 So how does data protection work? ....................................................................................................................... 84
9.5 How many countries in the world have data protection laws? ............................................................................... 85
9.6 Are data protection laws the same in all countries that have them? ....................................................................... 86
9.7 What is considered as personal information under data protection laws? .............................................................. 87
9.8 Data protection policies to safeguard Information. ................................................................................................ 87
CHAPTER-10 .......................................................................................................................................................... 88
10 Data Protection Law in India. .............................................................................................................................. 88
10.1 Data protection under foreign law ........................................................................................................................ 93
CHAPTER-11 .......................................................................................................................................................... 95
11. The EU General Data Protection Regulation .............................................................................................. 95
11.1 GDPR Timeline of Events ................................................................................................................................ 98
11.2 SOME FAQ FOR GDPR.................................................................................................................................... 100
Survey and Findings ............................................................................................................................................... 102
CONCLUSION ...................................................................................................................................................... 120
CHAPTER-1
1. INTRODUCTION
A. What is a Privacy Impact Assessment (PIA)?
A PIA is a methodology used to assess privacy risks to living individuals in the processing of
their personal data including collection, use and disclosure of information. The reasons which
may prompt an organization to undertake a PIA are as follows:
• Risk and commercial strategy management
• Cost effectiveness
• Appropriate solutions
• Business credibility
• Ascertaining legal compliance
Projects with privacy implications require a full-scale privacy impact assessment (PIA) process.
A small-scale PIA or a large-scale PIA may be conducted depending on the size of the project.
Because projects can differ substantially, a methodology should be devised that fits the specific
requirements of the project, is explicit, and is only as resource-intensive as is appropriate in the
circumstances.
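The decision between a small-scale and a full-scale PIA can be pictured as a simple screening step. The sketch below is illustrative only: the screening questions and the threshold of three "yes" answers are assumptions for the example, not part of the ICO methodology.

```python
# Illustrative sketch: decide PIA scale from yes/no screening answers.
# The questions and the threshold of 3 are assumptions, not ICO guidance.

SCREENING_QUESTIONS = [
    "Does the project collect new personal data?",
    "Does it involve sensitive personal data?",
    "Will data be shared with third parties?",
    "Does it use intrusive technologies (e.g. tracking)?",
    "Does it centralize previously separate data sets?",
]

def pia_scale(answers: list[bool]) -> str:
    """Return the suggested PIA scale; one True/False per screening question."""
    score = sum(answers)
    if score == 0:
        return "no PIA required"
    if score < 3:  # few privacy touch points: a small-scale PIA suffices
        return "small-scale PIA"
    return "full-scale PIA"

print(pia_scale([True, True, True, False, True]))  # full-scale PIA
```

In practice an organization would replace these placeholder questions with its own screening checklist and calibrate the threshold to its risk appetite.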
A PIA needs to be distinguished from a data protection audit. Normally, a PIA should not be
conducted on a project that has already been implemented. A PIA is best completed at a stage
where it can genuinely contribute to the development of a project. Carrying out a PIA on an
already existing project runs the risk of raising unrealistic expectations amongst stakeholders
during consultation, unless there is a genuine opportunity to alter the design and implementation
of a project.
A data protection audit is more appropriate for existing projects. An audit is valuable in that it
either confirms that data protection principles are being complied with, or highlights problems
that need to be addressed. A PIA aims to prevent problems from arising. A PIA is broader than an
audit of compliance.
PIAs have been designed as a self-assessment tool for organizations and the Data Protection
Office does not have a formal role in conducting them and/or approving any final report which is
produced. However, the office is available for all assistance required.
There is no legal obligation for any organization to complete a PIA. However, this template has
been developed by the Data Protection Office as a Guide for all data controllers.
• identification and assessment of privacy-enhancing alternatives;
• justification of unavoidable negative impacts on privacy by the business need that requires
them; and
• documentation and publication of the outcomes.
The aims are to identify privacy risks to individuals and data protection compliance liabilities
for your organization through the PIA, and to avoid expensive, inadequate “bolt-on” solutions.
The massive increase in the collection, storage, use and disclosure of personal data, and the
advent of intrusive technologies, are potentially harmful to individual privacy.
The nature of the personal information. This could include “sensitive personal data” as
defined by the Data Protection Act 2004, but also personal financial information, family
structures, personal email addresses, information about persons considered “at risk”, travel plans
etc.
The quality of personal information. This includes characteristics of the information
itself, such as accuracy, relevance and adequacy. The more personal information moves from its
original context, the greater the likelihood it can be misinterpreted. The quality of information
also raises questions about data matching and mining, whether you are matching like with like
and the number of false matches which may be produced.
The meaning behind terms used in personal information. This takes into account that
terms used can be context or sector specific. Variations in meaning of apparently similar terms
may give rise to misunderstandings or error which in turn could result in harm or disadvantage to
the individual. This area would also include examining metadata attached to personal
information.
The retention, deletion and destruction of personal information. How long do your
business needs require retention of information? Are there legal obligations to dispose of or retain
data? Do you need to keep information to counter legal claims or for audit and inspection
purposes? Can your organization make better use of ‘soft deletion’, where after the original
purpose has been met, access to the information is much more tightly controlled until the
organization can permanently delete it?
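The ‘soft deletion’ pattern mentioned above can be sketched in a few lines. This is a minimal illustration, assuming hypothetical field and method names; a real system would persist the flag in a database and log privileged access.

```python
import datetime

class Record:
    """Sketch of 'soft deletion': after the original purpose is met, the
    record is flagged rather than erased, and access is tightly restricted
    until the organization can permanently delete it."""

    def __init__(self, data: dict):
        self.data = data
        self.soft_deleted_at = None  # timestamp of soft deletion, if any

    def soft_delete(self):
        self.soft_deleted_at = datetime.datetime.now(datetime.timezone.utc)

    def read(self, caller_is_privileged: bool = False) -> dict:
        # Once soft-deleted, only privileged callers (e.g. audit) may read.
        if self.soft_deleted_at and not caller_is_privileged:
            raise PermissionError("record is soft-deleted; access restricted")
        return self.data

r = Record({"name": "A. Sample"})
r.soft_delete()
print(r.read(caller_is_privileged=True))  # audit access still works
```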
J. Function creep, beyond the original context of use, in relation to the use of personal
information or the use of identifiers.
K. Registration and authentication processes, including the burden such processes impose,
their intrusiveness, and the exercise of power by government over individuals.
Surveillance, whether audio or visual or by means of data, whether electronically supported or
not, and whether or not the observations are recorded. Location and tracking, whether within
geographical space or on networks, even where it is performed incidentally, and especially where
it gives rise to a record. From the perspective of privacy protection, there are considerable
privacy benefits in decentralization rather than centralization.
Where a project involves centralizing information, it is important that there is clear justification.
Further, those who want to use information in a more speculative manner (such as ‘statistical
analysis’, ‘management reporting’ and ‘data mining’) need to be challenged for greater detail,
and to show that benefits will be achievable. Once a case for centralization has been established,
it is necessary to identify, assess and balance the disadvantages.
M. Persons at risk and vulnerable populations. Some people, in some circumstances, face
particularly serious risks if their personal data is disclosed. This applies especially to their
physical location or data that may result in disclosure of their physical location. It may also apply
to, for example, health care or financial data. Useful generic terms for people to whom this
applies are ‘persons at risk’ and ‘vulnerable populations’. Examples include:
• politicians;
• entertainers and sportspeople;
• people ‘in the public eye’, such as lottery winners; or
• those who publicly promote controversial views.
Even where physical safety is not under threat, care may still be needed in respect of ‘vulnerable
populations’, some of whom may find it difficult to exercise control over their personal data.
Examples might include younger children or adults who lack capacity to provide consent. Your
organization might also want to consider the difficulties faced by individuals who are homeless
or ex-detainees. Certain health conditions might also put individuals at risk if inappropriately
disclosed.
Once you have identified and assessed the privacy risks your project presents, you need to
consider what action you intend to take in relation to each risk.
Accepting the risks
In some instances, because of the nature of the risks, impacts or liabilities, the low likelihood of
the risks being realized, or the minimal impact they may have, it may be entirely appropriate
simply to recognize and accept the privacy risks, or certain aspects of them. However, this must
not be done merely as an alternative to taking action to address risk, and it must be considered
carefully as an option. If considering this option, ensure that a record of the identified risk is
made, along with the reasons for accepting it.
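The requirement that an accepted risk be recorded together with its reasons can be captured in a simple register entry. The field names and the 1–5 likelihood/impact scales below are illustrative assumptions, not a prescribed register format.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One privacy-risk register entry (illustrative field names).
    `accept` enforces that accepted risks record their justification."""
    description: str
    likelihood: int      # assumed scale: 1 (rare) .. 5 (almost certain)
    impact: int          # assumed scale: 1 (negligible) .. 5 (severe)
    decision: str = "open"
    justification: str = ""

    @property
    def level(self) -> int:
        # Simple likelihood-times-impact score for comparing risks.
        return self.likelihood * self.impact

    def accept(self, justification: str):
        if not justification.strip():
            raise ValueError("accepted risks must record the reasons")
        self.decision = "accepted"
        self.justification = justification

risk = RiskEntry("email addresses retained beyond need", likelihood=2, impact=3)
risk.accept("low likelihood; retention limited to 30 days by policy")
print(risk.level, risk.decision)  # 6 accepted
```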
A mitigation measure is a feature that compensates for other privacy-intrusive aspects of a design.
A mitigation measure may compensate partially or wholly for a negative impact. Examples
include:
• minimization of personal data retention by not recording it;
• destruction of personal information as soon as the transaction for which it is needed is
completed;
• destruction schedules for personal information which are audited and enforced;
• limits on the use of information which has been collected for a very specific purpose,
with strong legal, organizational and technical safeguards preventing its application to any
other purpose.
The focus of each impact and implication should be identified. For instance, what kinds of
people or organizations will experience the various impacts, and under what
circumstances?
The justification for the feature that gives rise to the problem should be examined. For
example, is the privacy infringement proportional to, or appropriately balanced with, any
benefits gained from the infringement? And is it clear that the claimed benefits will
actually arise?
The use of PIA in the UK dates back to at least December 2007, when the ICO published the first
PIA Handbook in Europe. The Handbook was based on research conducted by an
internationally distinguished team led by Loughborough University. Among the PIA analysts in
this team were Professor Colin Bennett (University of Victoria, B.C., Canada) and privacy and
surveillance expert Roger Clarke, a consultant and Professor in Australia. The research team
studied and produced reports on PIA practice and methodology in Australia, Canada, Hong
Kong, New Zealand and the United States in order to identify best practices that could inform
the ICO Handbook, the principal author of which was Clarke. The ICO issued a second
edition of the Handbook in June 2009. It is now working on a third edition, and the present
study is intended to provide some research upon which the new version can draw. We
understand that the new PIA guidance will be somewhat shorter and more streamlined than its
predecessors. Based on this study, previous research conducted especially in the context of the
EC-funded PIAF project, and our contacts with industry, we concur that a more streamlined
guide is warranted.
The ICO saw PIA as an element in risk management, as the Handbook makes clear. It says that
“organizations may find it appropriate to plan a PIA within the context of risk management.”
It also says that the government “will check that they have been carried out as an integral part
of the risk management assessment.”
Better integration of PIA with risk management practices has been an issue with other data
protection authorities, as the following paragraphs show, and for quite some time too. In one of
the earliest papers on PIA.
1.3 Methodologies
The research on which this report is based uses various approaches and methodologies.
We conducted a literature review of the various project and risk management standards and
methodologies analyzed in this report. An Internet search located 26 UK privacy impact
assessment reports; the purpose of the search was to determine which project and risk
management standards and methodologies are being used in the UK, and whether the recipient
organizations had conducted any PIAs.
CHAPTER-2
2. Project and Technology Development Management Standards and
Methodologies.
This chapter describes popular project management standards and methodologies in use
in the UK and abroad. For each methodology, we provide an overview followed by a
table in which we “interrogate” the methodology using a set of questions derived from
the PIA Handbook touch points. The following table shows how we have converted the
touch points into a set of questions.
The touch points extracted from the ICO PIA Handbook, and the questions for project
management (PM) methodologies derived from them, are as follows:

1. Touch point: PIAs must comply with (more than just data protection) legislation. Private
sector organizations will also have to consider industry standards, codes of conduct and privacy
policy statements.
Question: Does the PM methodology include provisions about compliance with legislation and
any relevant industry standards, codes of conduct, internal policy, etc.?

2. Touch point: PIA is a process.
Question: Is the PM methodology regarded as a process, or is it simply about producing a report?

3. Touch point: A PIA could consider: (1) privacy of personal information; (2) privacy of the
person; (3) privacy of personal behavior; and (4) privacy of personal communications.
Question: Does the PM methodology address only information privacy protection, or does it
address other types of privacy as well?

4. Touch point: A PIA should be undertaken when it is possible to influence the development of
a project.
Question: Does the PM methodology say that it should be undertaken when it is still possible to
influence the development of the project?

5. Touch point: Responsibility for the PIA should rest at the senior executive level.
Question: Does the PM methodology place responsibility for its use at the senior executive level?

6. Touch point: The organization should develop a plan for the PIA and its terms of reference. It
should develop a consultation strategy appropriate to the scale, scope and nature of the project.
Question: Does the PM methodology call for developing a plan and terms of reference? Does it
include a consultation strategy appropriate to the scale, scope and nature of the project?

7. Touch point: A PIA should include an environmental scan (information about prior projects of
a similar nature, drawn from a variety of sources).
Question: Does the PM methodology call for the conduct of an environmental scan (information
about prior projects of a similar nature, drawn from a variety of sources)?

8. Touch point: The organization should determine whether a small-scale or full-scale PIA is
needed.
Question: Does the PM methodology include provisions for scaling its application according to
the scope of the project?

9. Touch point: A PIA should seek out and engage stakeholders internal and external to the
organization. The assessor needs to make sure that there is sufficient diversity among those
groups or individuals being consulted, to ensure that all relevant perspectives are represented
and all relevant information is gathered.
Question: Does the PM methodology call for consulting all relevant stakeholders, internal and
external to the organization, in order to identify and assess the project’s impacts from their
perspectives?

10. Touch point: The organization should put in place measures to achieve clear
communications between senior management, the project team and representatives of, and
advocates for, the various stakeholders.
Question: Does the PM methodology include provisions for putting in place measures to achieve
clear communications between senior management, the project team and stakeholders?

11. Touch point: The PIA should identify risks to individuals and to the organization.
Question: Does the PM methodology call for identification of risks to individuals and to the
organization?

12. Touch point: The organization should identify less privacy-invasive alternatives. It should
identify ways of avoiding or minimizing the impacts on privacy or, where negative impacts are
unavoidable, clarify the business need that justifies them.
Question: Does the PM methodology include provisions for identifying protection measures
and/or design solutions to avoid or to mitigate any negative impacts of the project or, when
negative impacts are unavoidable, does it require justification of the business need for them?

13. Touch point: The organization should document the PIA process and publish a report of its
outcomes.
Question: Does the PM methodology include provisions for documenting the process?

14. Touch point: A PIA report should be written with the expectation that it will be published, or
at least be widely distributed. The report should be provided to the various parties involved in
the consultation. If information collected during the PIA process is commercially or security
sensitive, it could be redacted or placed in confidential appendices, if justifiable.
Question: Does the PM methodology include provision for making the resulting document
public (whether redacted or otherwise)?

15. Touch point: The PIA should be re-visited in each new project phase.
Question: Does the PM methodology call for a review if there are any changes in the project?

16. Touch point: A PIA should be subject to third-party review and audit, to ensure the
organization implements the PIA recommendations or, if not all, that it has provided adequate
justification for not implementing some recommendations.
Question: Does the PM methodology include provisions for an audit to ensure that the
organization implements all recommendations or, if not all, that it has provided adequate
justification for not implementing some recommendations?
By developing a set of questions based on the PIA Handbook touch points to interrogate the
project management methodology, we can determine whether there are sufficient
commonalities between the PIA process and the project management process so that a PIA
could be conducted in tandem with the project management process without disrupting it.
Further, if there are a sufficient number of commonalities, then we assume that integration of
PIA into the project management process will be possible without much difficulty. If there are
an adequate number of touch points, we assume that it will be easier to convince project
managers that they should take account of – or integrate – PIA in their project management
process.
Even if there are not so many touch points, there is still a possibility of integrating PIA in the
project management process through one or more “open doors” – i.e., points in the project
management process where or when it would be possible to conduct a PIA.
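The idea of interrogating a methodology with the touch-point questions and counting commonalities can be sketched as follows. The abbreviated question labels and the sample answers are placeholders for illustration, not a real assessment of any methodology.

```python
# Sketch: count how many touch-point questions a PM methodology answers
# "yes" to, as a rough measure of commonality with the PIA process.
# Question labels are abbreviated; answers below are placeholders.

TOUCH_POINT_QUESTIONS = [
    "compliance with legislation and standards",
    "regarded as a process, not just a report",
    "addresses privacy types beyond information privacy",
    "undertaken while the project can still be influenced",
    "responsibility at senior executive level",
    "plan, terms of reference and consultation strategy",
    "environmental scan of prior projects",
    "scaled to the scope of the project",
    "consults internal and external stakeholders",
    "clear communications among parties",
    "identifies risks to individuals and organization",
    "identifies mitigations or justifies negative impacts",
    "documents the process",
    "publishes the resulting document",
    "review on project changes",
    "third-party audit of recommendations",
]

def commonality(answers: dict[str, bool]) -> float:
    """Fraction of the 16 touch-point questions answered 'yes'."""
    yes = sum(answers.get(q, False) for q in TOUCH_POINT_QUESTIONS)
    return yes / len(TOUCH_POINT_QUESTIONS)

# Hypothetical answers for one methodology: "yes" to the first 12 questions.
answers = {q: True for q in TOUCH_POINT_QUESTIONS[:12]}
print(f"{commonality(answers):.2f}")  # 0.75
```

A high fraction suggests a PIA could run in tandem with the methodology; a low one points instead to the “open doors” approach described above.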
With its origins as a white paper, and later expanded as the PMI (Project Management Institute)
Project Management Body of Knowledge in the PMI-published PM Network periodical in 1987,
this standard was approved as an ANSI (American National Standards Institute) standard in
1999. Forty-one per cent of organizations responding to a global survey by
PricewaterhouseCoopers report that PMBOK is the dominant project management methodology
used for managing all types of projects. As an indicator of the broad scope of its adoption, PMI
reports that more than 650,000 people in 185 countries are PMI members or hold credentials in
one of the areas related to PMBOK.
This standard encompasses a broad range of principles, process groups and knowledge areas for
project management. The processes and knowledge developed and described under this standard
have been written about and amended over several iterations by PMI volunteers, who have
brought expertise from their work in the project management profession. The PMBOK© Guide
acknowledges as well the “plan-do-check-act” cycle, as originally defined by Shewhart in the
1930s and further modified by Deming in the 1950s, as an underlying concept for the
interaction amongst these processes.
The process groups (many of which are directly paralleled in ISO 21500, the development to
which PMI contributed) include those described below.
initiating processes, which are associated with the initial definition or authorization of
projects or project phases,
planning processes, which aim to define and/or refine goals and objectives and plan
actions needed to achieve them,
executing processes, where people and resources are brought together to complete the
work that has been planned,
monitoring and controlling processes, which are focused upon measuring and
checking progress against the developed plan, and
closing processes, which end the project or project phase in an orderly fashion, with a
focus upon acceptance of the work performed.
PMBOK defines nine knowledge areas, which project managers apply (to a greater or lesser degree) across the five process groups described above. The knowledge areas defined and described in the standard include:
Project Integration Management. This knowledge area focuses upon the integration
of processes amongst the project management process groups. Within this knowledge area
are described the development of the project charter, preliminary project scope and the
overall project management plan.
Project Scope Management. This knowledge area includes processes that aim to define
the work of the project and ensure it encompasses all (but only) the work required to complete
the project, as well as to control the scope over the course of the project through an integrated
change control process. The scope of work is defined through a work breakdown structure
(WBS) that deconstructs the work and identifies deliverables.
Project Time Management. This knowledge area comprises processes aimed at developing and
managing the overall project schedule, including activity definition and sequencing, estimating
resource and activity duration, and analysis required to develop a schedule from these inputs.
Project Cost Management. This knowledge area includes those processes that support planning,
estimating and controlling project costs. The over-arching aim served by these processes is to
develop the project within its budget. This knowledge area includes concepts of life-cycle
costing, along with value engineering techniques to improve decision-making within the project’s
life in order to optimize quality and performance.
Project Quality Management. This knowledge area includes the processes that provide for the implementation of quality policies, objectives and responsibilities, and for the quality system utilized by the organization, organizing this specifically through quality planning, quality assurance and quality control activities. The standard describes and defines approaches to implementing various quality standards and to monitoring results to ensure they meet those standards. It provides for continuous improvement through the application of the cyclical “plan-do-check-act” cycle or other quality improvement initiatives (e.g., TQM, Six Sigma).
Project Human Resource Management. This knowledge area includes processes often referred
to as “soft skills”. The processes include those aimed at organizing and managing the project
team, from human resource planning, defining roles and responsibilities, and staff management
planning to acquiring, developing and managing the project team. The processes include
quantitative planning efforts as well as guidance for negotiating for resources, team building,
conducting performance appraisals and other soft management skills.
Project Risk Management. The processes included in the knowledge area are those connected
to planning for, identification of, responding to, monitoring and controlling risk within a project.
Risks are qualitatively and quantitatively analyzed, and risk probabilities and impacts defined. A
risk breakdown structure (RBS) is defined as an output of these processes. Given the uncertain
nature of risk, numerous strategies for identifying and controlling risks are described.
Project Procurement Management. This knowledge area includes the processes for acquiring
or purchasing the products or services needed from sellers outside the project team, and
includes activities for planning purchases and acquisitions and contracting, selecting sellers,
performing contract administration and ultimately closing out contracts.
The methodology provides detailed, structured approaches to address each of the process areas
within the context of each knowledge area, detailing steps to be completed and documents to be
produced. In addition to the PMBOK© Guide, specific separate practice standards are provided
for specific tools, techniques or processes identified in the PMBOK© Guide, including those for
Project Risk Management, Earned Value Management, Project Configuration Management,
Work Breakdown Structures, Scheduling, and Project Estimating. In addition,
foundational standards are provided for construction projects and government-based projects as
extensions of PMBOK©.
Of the nine knowledge areas, several should be particularly noted as they may apply to the
integration of PIA:
Project Integration Management. As this knowledge area focuses upon the integration of
processes, and privacy impact assessments may be viewed as looking across the entirety of a
project, introduction of privacy and data protection goals may be determined to be relevant
within the project charter and/or scope.
Project Scope Management. Specific goals for privacy and the conduct of a privacy impact
assessment (or a cyclical implementation of privacy impact assessments over the course of
multiple project phases) could be introduced in the scope of the project as developed and
managed in this knowledge area.
Project Risk Management. Privacy and data protection related risks are assessed via the PIA. This knowledge area would be the appropriate place for introducing the PIA among the tools and techniques associated with project risk management.
The documents which are produced by the project management professional, and are the focus of
the PMBOK© Guide, are the Project Charter (formally authorizing the project), the Project
Scope Statement (stating the work to be done and deliverables expected), and the Project
Management Plan (indicating how the work will be done).
The PMP accreditation associated with PMBOK© is the most widely held certification for
project managers on a global basis. The certification is issued by the Project Management
Institute (PMI), which also publishes the related standards as A Guide to the Project Management
Body of Knowledge (PMBOK© Guide), at the time of writing in its sixth edition (2017).
The table below sets out the questions for a project management methodology, based on the touch points, against the evidence from the PMBOK© methodology (excerpted rows).

Question 2. Is the PM methodology regarded as a process or is it simply about producing a report?
Evidence: The methodology is a process-driven approach, which is flexibly applied across all types and phases of projects.

Question 3. Does the PM methodology address only information privacy protection or does it address other types of privacy as well?
Evidence: There is no explicit focus upon privacy.

Question 4. Does the PM methodology say that it should be undertaken when it is still possible to influence the development of the project?
Evidence: This is not addressed by the methodology.

Question 5. Does the PM methodology place responsibility for its use at the senior executive level?
Evidence: The methodology encourages the inclusion of various types of stakeholders, including executive levels of management, particularly when initiating the project and gaining authorization, as well as in scope definition and acceptance.

Question 15. Does the PM methodology call for a review if there are any changes in the project?
Evidence: As there is no explicit call for PIA within the methodology, there is likewise no call for a review. However, the processes recognize the cyclical nature of a project with an integrated change control process, which may include its own criteria for initiating a review of privacy issues based upon the nature of changes.

Question 16. Does the PM methodology include provisions for an audit to ensure that the organization implements all recommendations or, if not all, that it has provided adequate justification for not implementing some recommendations?
Evidence: No, the methodology makes no provision for an audit of changes prescribed by a PIA, but it may be that the change control process should include provisions for such follow-on validation.
Privacy impact assessments have well-defined goals and can be very effectively integrated
within the PMBOK framework. The main focal point for integration should be within the
Project Risk Management knowledge area, and the PIA should be presented as an available tool
for assessment of privacy risk. In addition, privacy and data protection should be introduced, along with regulatory and legislative factors, as environmental considerations when developing the project charter and scope, and in the context of change control.
CHAPTER-3
3. The Human Brain Project
The Human Brain Project (HBP) is a European initiative to come to a better understanding of the human brain, and to enable advances in neuroscience, medicine and future computing technologies. The vision of the HBP is to “gain profound insights into what makes us human, build revolutionary computing technologies and develop new treatments for brain disorders”.
The HBP is one of two so-called flagship projects funded by the EU. It was launched in
October 2013, and it is planned to run for a 10-year period. The project has a total budget of
over 1 billion Euros, and it includes collaborators in more than 20 countries in Europe and
beyond.
The HBP is building six ICT platforms for scientific research. In this background material we
mainly talk about the Medical Informatics Platform (MIP). The MIP allows researchers to ask
questions of personal health data stored in European hospitals. The HBP would like to know how
people in Europe think about the use of their data in research.
Data is a word we use to describe information or knowledge that is represented in such a way that
it allows for storage, usage and processing. Data could be, for example, your address, age, gender,
education, blood pressure, sexual orientation and so forth. In themselves, individual pieces of
data might not say a lot about a person, a group or a country. Pieced together however, one starts
to be able to make predictions on the basis of certain correlations between individual data points
or data sets. Such correlations could be used to gain knowledge about the risk of disease in
groups of individuals with certain behavior. The more data one has to begin with, the more
powerful one’s predictions will be.
Researchers are interested in health data, because it provides their research with a lot of power
for prediction and pattern recognition. With more data, researchers hope to be able to gain new
insights into disease, and they hope that such insights will contribute to better healthcare
practices. The access and use of a person’s data is regulated by national law and by EU law and guidelines. In the following section, we will explain EU regulations and opinions on data
protection.
3.2 EU regulations on data protection
In the process of creating, storing and using data there are three different roles: the data subject, the data controller and the data processor. The data subject is the individual about whom the data is collected. The data controller is the party that stores and controls the data, and is legally responsible for any breaches of data security or harms arising from the use of the data it controls. The data processor is the party that may receive and use data from the data controller (European Parliament, 1995; Stationery Office, 1998). It is worth noting that EU data regulation is presently undergoing change.
The relevant EU regulations on data protection ensure that no person or institution is allowed to hold or process your personal data unless you have given permission or it is required by other laws (European Parliament, 1995, 2015). However, this protection does not apply when personal data is considered to have been anonymized. Anonymization is considered successful when re-identifying individual data subjects would take more effort and resources than can reasonably be expected.
For example, if re-identifying a data subject from a dataset only requires a normal home
computer, and some simple software, data cannot be considered anonymized. However, if it
would require an office full of scientists, and one of the fastest supercomputers in the world to
perform complex calculations for several weeks, the amount of effort is probably beyond what
can be expected from an attacker, especially when the dataset contains relatively insensitive data.
For research purposes, it is worth noting that researchers also need to have their research approved by local research ethics committees before they can carry it out. In addition
to legal protections of data, there are also moral grounds for protecting health data used in
research. In the next section we go through the key arguments.
Data is being created around us all the time, and the variety of data that exists about most individuals is extensive. It is not limited to the weight, diagnoses and age in a clinical record: a great deal of data about a person exists in other places, for example all your emails, the people you have called in the last year, where you went to school, who your employer is and everything that relates to you on social media. Because some of this data can be very sensitive and private, its protection, as just discussed, is considered part of a person’s right to privacy. This type of privacy is called data privacy.
In this section, we introduce general ethical issues related to research that uses personal data.
The first section develops the argument from the starting point of individual rights, while the
second section starts from considerations of societal benefits.
Apart from considering privacy important in itself, an important common perspective is that
privacy is important because it protects data subjects from potentially negative consequences of
other people having and using their data. An important aspect of such negative consequences is
unlawful discrimination, for example when you are not accepted for a job because you have a high risk of developing dementia at a young age, when you are not allowed to take out insurance
because you suffer from mental instabilities, or when you can’t get a loan because of your
religion (Rose, 2015; The Danish Council of Ethics, 2015).
Naturally, not all types of information about a person are equally sensitive. However, information about a person’s health, and other data used and produced in research, is typically considered sensitive.
This is a type of information that can affect a person’s ability to change the course of their life, to
get a job or to form new relations. The EU also considers data about race, ethnic origin, political, religious or philosophical beliefs, trade union membership, health or sexual orientation as sensitive information that requires special protection.
In medical research, an individual’s right to make one’s own life choices is secured through
informed consent. This right is considered important because it allows people to live their
lives based on their own values. For example, people might have different ideas about the types
of research they would like to support. Following this line of thinking, individuals should have
the opportunity to decide what types of research their data is used for.
Furthermore, participation in research is not always without risks, while the outcome of the
research may not necessarily benefit the participants.
The general idea of informed consent is that every time a researcher wishes to use data from individual persons for research, the researcher has to inform the data subject of all the relevant details of the research project and then ask for the individual’s permission to use their data. At present, this is exactly how informed consent is structured.
This means that scientists need their data subjects to sign an informed consent form for every separate study they do, which is generally regarded as quite a hassle, limiting the amount of available data and slowing down research.
CHAPTER-4
Risk management is used in many areas (information security, safety, finance, insurance).
This chapter provides an implementation of this approach in the context of privacy. The
methodology presented below is fully compliant with international standards for risk
management. It naturally fits into global risk management approaches.
In the area of privacy, the only risks to consider are those that the processing of personal data poses to privacy. Each such risk is composed of one feared event (what do we fear?) and all the threats that make it possible (how can this occur?).
The primary assets to consider include:
Processes: those of the processing itself (its features as such, in so far as they deal with personal data) and those required by [Act-I&L] in order to inform the data subjects (Article 32), obtain their consent (if appropriate, Article 7) and allow the exercise of the rights of opposition (Article 38), access (Article 39), and correction and deletion (Article 40);
Personal data: those directly concerned by the processing and those concerned by the processes required by [Act-I&L].
Indeed, the occurrence of such feared events would have impacts on the privacy of data subjects, human
identity, human rights or civil liberties.
The feared event describes the situation and the potential impacts in the considered context.
For example: applications for social benefits disappear, thus depriving the beneficiaries of the said benefits and forcing them to repeat their administrative formalities.
For a feared event to occur, there must be one or more risk sources causing it, accidentally or
deliberately. Risk sources may include:
Risk sources will act, accidentally or deliberately, on the various information system
components, on which the primary assets rely. These supporting assets may include:
The action of the risk sources on supporting assets may happen through different threats:
Function creep: supporting assets are diverted from their intended context of use without
being altered or damaged;
Espionage: supporting assets are observed without being damaged;
Exceeded limits of operation: supporting assets are overloaded, over-exploited or used
under conditions not permitting them to function properly;
Damage: supporting assets are partially or completely damaged;
Changes: supporting assets are transformed;
Property losses: supporting assets are lost, stolen, sold or given away, so it is no longer
possible to exercise property rights.
Examples of threats
A risk is a scenario that describes how risk sources might exploit the vulnerabilities of supporting assets, causing an incident on primary assets and impacts on privacy. The risk level is estimated in terms of severity and likelihood. Severity represents the magnitude of a risk. It essentially depends on the level of identification of the personal data and the level of consequences of the potential impacts. Likelihood represents the feasibility of a risk occurring. It essentially depends on the level of vulnerability of the supporting assets facing the level of capability of the risk sources to exploit them. The following figure synthesizes the above-mentioned notions:
Using a risk management method is the safest way to ensure objectivity and relevance of the
choices to make when setting up a processing of personal data.
To assess the risks, feared events first have to be identified and estimated in terms of severity. Then, for those whose severity is high, the threats that could lead to them have to be identified and their likelihood estimated. The risks thus assessed can then be treated using commensurate measures.
The approach consists in studying:
1. The context of the processing of personal data,
2. The feared events in this particular context,
3. The possible threats (if needed),
4. The risks involved (if needed),
5. The appropriate measures to treat them.
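The notions used in this approach (feared events estimated in terms of severity, threats estimated in terms of likelihood, and risks combining the two) can be sketched as a minimal data model. The class names and the four-point scales are illustrative assumptions, not part of the methodology's normative text; the combination rules (a risk's severity equals that of its feared event, its likelihood the highest likelihood among its threats) follow the risk study described in Chapter 5.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Threat:
    """A possible action of a risk source on a supporting asset,
    estimated in terms of likelihood (1-4, an assumed four-point scale)."""
    description: str
    likelihood: int

@dataclass
class FearedEvent:
    """What we fear happening to primary assets, estimated in terms of
    severity (1-4, assumed scale) from the identification level of the
    personal data and the prejudicial effect of the potential impacts."""
    description: str
    severity: int

@dataclass
class Risk:
    """A scenario: one feared event plus all the threats that may allow it."""
    feared_event: FearedEvent
    threats: List[Threat] = field(default_factory=list)

    @property
    def severity(self) -> int:
        # The risk's severity equals that of its feared event.
        return self.feared_event.severity

    @property
    def likelihood(self) -> int:
        # The risk's likelihood equals the highest likelihood among its threats.
        return max((t.likelihood for t in self.threats), default=1)
```

For instance, a feared event of severity 3 reachable through threats of likelihood 2 and 3 yields a risk of severity 3 and likelihood 3.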
CHAPTER-5
This chapter describes the approach to be taken in order to analyze the risks posed to privacy by
the processing of personal data. It describes how to use the [EBIOS] method in the specific
context of data protection.
EBIOS (Expression of Needs and Identification of Security Objectives) is a method for analysis,
evaluation and action on risks relating to information systems. It generates a security policy
adapted to the needs of an organization. The method was created in 1995 and is now maintained by ANSSI, an agency reporting to the French Prime Minister.
The five steps of the EBIOS method are:
Circumstantial study - determining the context;
Expression of security needs;
Risk study;
Identification of security goals; and
Determination of security requirements.
EBIOS is primarily intended for governmental and commercial organizations working with the
Defense Ministry that handle confidential or secret defense classified information. It enables
well-informed security actions to be undertaken. The objective is to assess and prepare for
possible future situations (in the case of a newly created information system), and identify and
respond to deficiencies (when the system is operating) in order to refine the security
arrangements.
In its first version, EBIOS focused on the drafting of security objectives. From 2000 onwards, the DCSSI, aware of developments in international standards (ISO in particular), adapted EBIOS to these criteria. This can also be viewed as a response to the limitations of France's introspective approach to information security, whose methods were not recognized abroad and were unsuited to international markets. However, the method's documentation appears to be available only in French.
In 2002, international comparisons placed EBIOS among the three best methods for analyzing
ISS risks. Many organizations in the public and private sectors use the method to conduct their
own ISS risk analyses.
The aim at this stage is to gain a clear view of the scope under consideration, identifying all the information useful for risk management by answering the following questions:
Which primary assets need to be protected?
What are the relevant sources of risk that might affect the specific context of the processing
operation under consideration?
The aim of this step is to obtain a detailed and prioritized list of all feared events that may affect
the processing operation under consideration. An example is provided in the table on page 14.
Clarifying feared events requires identifying their potential impacts. In other words, what
consequences could each feared event have on the identity and privacy of data subjects and
human rights or civil liberties if:
The legal processes were unavailable?
The processing operation was modified?
An unauthorized person accessed personal data?
Personal data were modified?
Personal data disappeared?
These feared events are ranked by determining their severity based on the level of identification
of personal data and the prejudicial effect of these potential impacts.
First of all, the level of identification of all personal data (identified beforehand) must be
assessed. In other words, how easy is it to identify data subjects?
b) Limited: Identifying an individual using their personal data appears to be difficult but is
possible in certain cases (e.g. searching throughout the French population using an individual's
full name).
Next, the prejudicial effect of each feared event should be estimated. In other words, how much
damage would be caused by all the potential impacts?
a) Negligible: Data subjects either will not be affected or may encounter a few
inconveniences, which they will overcome without any problem (time spent reentering
information, annoyances, irritations, etc.).
b) Limited: Data subjects may encounter significant inconveniences, which they will be able
to overcome despite a few difficulties (extra costs, denial of access to business services, fear, lack
of understanding, stress, minor physical ailments, etc.).
c) Significant: Data subjects may encounter significant consequences, which they should be
able to overcome albeit with serious difficulties (misappropriation of funds, blacklisting by
banks, property damage, loss of employment, subpoena, worsening of state of health, etc.).
The value of the level that best matches the potential impacts identified is then selected. Any
existing or planned measures that make these potential impacts less harmful should be listed as
justification as shown in the table.
Finally, the severity is determined by adding the value obtained for the level of identification of the personal data to the value obtained for the prejudicial effect of the potential impacts, and locating the sum in the table below:
Option: The severity level thus obtained may be raised or lowered by including additional
factors. For example, a large number of data subjects (which can open the door to a massive
damaging event) may raise the level of severity by one. A large number of interconnections
(especially with foreign sites) or recipients (which facilitates the correlation between originally
separated personal data) might also be considered as an aggravating factor. Conversely, a very
small number of data subjects or very few or no interconnections or recipients might lower the
severity level by one.
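A hedged sketch of the additive computation just described: the sum of the two four-point levels is located in a mapping table, and the optional adjustment reflects the aggravating or mitigating factors above. The sum-to-level mapping used here is illustrative only, since the document's severity table is not reproduced.

```python
def severity(identification: int, prejudicial_effect: int, adjustment: int = 0) -> int:
    """Severity of a feared event: add the level of identification of the
    personal data (1-4) to the prejudicial effect of the potential impacts
    (1-4) and locate the sum in a mapping table. `adjustment` is the
    optional +/-1 for aggravating or mitigating factors (e.g. +1 for a
    large number of data subjects). The mapping below is illustrative."""
    assert 1 <= identification <= 4 and 1 <= prejudicial_effect <= 4
    total = identification + prejudicial_effect
    if total <= 3:
        level = 1   # negligible
    elif total == 4:
        level = 2   # limited
    elif total <= 6:
        level = 3   # significant
    else:
        level = 4   # maximum
    # Clamp the adjusted result back into the 1-4 scale.
    return min(4, max(1, level + adjustment))
```

For instance, limited identification (2) combined with a significant prejudicial effect (3) gives a sum of 5, a significant severity under this illustrative mapping.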
The aim of this step is to obtain a detailed, prioritized list of all threats that may allow feared
events to occur. It is possible to leave out threats relating to feared events of negligible (1) or
limited (2) severity. An example is provided in the table.
Since a threat is a possible action by risk sources on supporting assets, the supporting assets concerned should be identified and estimated for each threat.
First, the vulnerabilities of the supporting assets are estimated for each threat. In other words, to
what degree can the properties of supporting assets be exploited in order to carry out a threat?
Negligible: Carrying out a threat by exploiting the properties of supporting assets does
not appear possible (e.g. theft of paper documents stored in a room protected by a badge reader
and access code).
Limited: Carrying out a threat by exploiting the properties of supporting assets appears to
be difficult (e.g. theft of paper documents stored in a room protected by a badge reader).
The value of the level that best matches the supporting asset vulnerabilities identified is then
selected. Any existing or planned measures that reduce the vulnerabilities of supporting assets
should be listed as justification as shown in the table.
Finally, the likelihood of the threats is determined by adding the values obtained for the vulnerabilities of the supporting assets and the capabilities of the risk sources, and locating the sum in the table below:
Option: The likelihood thus obtained may be raised or lowered by including additional factors.
For example, access to the Internet, exchanges of data with foreign sites, interconnections with
other systems and a high degree of system heterogeneity or variability may raise the likelihood
by one level. Conversely, a homogeneous, stable system that has no interconnections and is
closed off from the Internet may lower the likelihood by one level.
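The likelihood computation mirrors the severity computation: the vulnerability and capability levels are added, the sum is located in a table, and the optional one-level adjustment captures the factors just listed. As with severity, the sum-to-level mapping used in this sketch is an illustrative assumption, since the document's table is not reproduced.

```python
def likelihood(vulnerability: int, capability: int, adjustment: int = 0) -> int:
    """Likelihood of a threat: add the vulnerability level of the supporting
    assets (1-4) to the capability level of the risk sources (1-4) and
    locate the sum in a mapping table (illustrative here). `adjustment` is
    the optional +/-1, e.g. +1 for Internet access or a heterogeneous,
    variable system, -1 for a closed, homogeneous and stable system."""
    assert 1 <= vulnerability <= 4 and 1 <= capability <= 4
    total = vulnerability + capability
    level = 1 if total <= 3 else 2 if total == 4 else 3 if total <= 6 else 4
    # Clamp the adjusted result back into the 1-4 scale.
    return min(4, max(1, level + adjustment))
```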
Tool
The result of this step can be added to the feared events table from the previous step:
5.3 Risk study: What is the risk level?
This step may be skipped if the severity level is negligible (1) or limited (2).
The aim of this step is to obtain a map of the risks in order to determine the order in which they should be treated.
Since a risk consists of a feared event and all the threats that may allow it to occur:
Its severity equals that of the feared event,
Its likelihood equals the highest likelihood value of the threats associated with the feared event.
Option: Objectives may be set based on where risks are located on the map (in order of priority):
1. Risks with a high severity and likelihood absolutely must be avoided or reduced by
implementing security measures that reduce both their severity and their likelihood. Ideally, care
should even be taken to ensure that these risks are treated by independent measures of prevention
(actions taken prior to a damaging event), protection (actions taken during a damaging event) and
recovery (actions taken after a damaging event).
2. Risks with a high severity but a low likelihood must be avoided or reduced by
implementing security measures that reduce either their severity or their likelihood. Emphasis
must be placed on preventive measures.
3. Risks with a low severity but a high likelihood must be reduced by implementing
security measures that reduce their likelihood. Emphasis must be placed on recovery measures.
4. Risks with a low severity and likelihood may be taken, especially since the treatment
of other risks should also lead to their treatment.
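The four treatment rules above amount to a lookup on the severity/likelihood map. In the sketch below, levels 3-4 count as "high" and 1-2 as "low" on an assumed four-point scale; this threshold is illustrative, not prescribed by the text.

```python
def treatment_priority(severity: int, likelihood: int) -> str:
    """Map a risk's position on the severity/likelihood map to the
    treatment strategy described above (four-point scales assumed)."""
    high_sev, high_lik = severity >= 3, likelihood >= 3
    if high_sev and high_lik:
        # Rule 1: reduce both severity and likelihood, ideally through
        # independent prevention, protection and recovery measures.
        return "avoid or reduce both severity and likelihood"
    if high_sev:
        # Rule 2: emphasis on preventive measures.
        return "avoid or reduce; emphasise preventive measures"
    if high_lik:
        # Rule 3: emphasis on recovery measures.
        return "reduce likelihood; emphasise recovery measures"
    # Rule 4: may be accepted; treating other risks should cover these.
    return "may be taken"
```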
Risk is an inherent part of all human activities, so, not surprisingly, assessing risk and making decisions about how to avoid or minimise it are fundamental to human existence. Whether evaluating whether to walk down an unfamiliar street at night or undergo a medical procedure, the process of assessing and managing risk is so fundamental and engrained that individuals do it intuitively and often without any conscious awareness. Risk management has also become a critical component of most institutional activities. Deciding what to buy or sell, whom to hire and where to locate are just a few examples of the many decisions that are based on an evaluation of the risks and benefits involved. As PricewaterhouseCoopers has noted in its Practical Guide to Risk Assessment, identifying and managing risk are “increasingly important to the success and longevity of any business”. In recent years, many countries have enacted laws and regulations requiring or encouraging more formal risk management. Today formal, documented risk assessments and other risk management tools are required by an expansive range of laws, ranging from workplace safety to financial reporting. Along with these legal requirements has come a professional practice of risk management, including specialised research, international and sectoral standards, a common vocabulary and agreed-upon principles and processes.

Data protection has long relied on risk management as a critical tool for ensuring that data are processed appropriately and that the fundamental rights and interests of individuals are protected effectively. Risk management has become an increasingly prominent feature of legal requirements over the past two decades. Even beyond those legal requirements, however, organisations have employed risk management as a logical, familiar and effective tool for protecting privacy. Risk management does not alter rights or obligations, nor does it take away organisational accountability. To the contrary, it is an integral part of accountability and what accountable organisations should be doing. It has proven a valuable tool for calibrating accountability, prioritising action, raising and informing awareness about risks, and identifying appropriate mitigation measures. Furthermore, it is especially valuable as a step towards greater interoperability in the face of divergent national and sectoral legal requirements, helping organisations to manage compliance on a more global basis as they work with regulators to identify mutually accepted approaches and values, thus driving common outcomes despite the lack of common legal rules. Data protection regulators themselves are also increasingly employing risk management as a way of targeting scarce resources where they are most needed and can have the greatest impact.

Yet risk management in data protection, whether undertaken by businesses or regulators, has often been informal and unstructured, and has failed to take advantage of many of the widely accepted principles and tools of risk management in other areas. In addition, risk management in the field of data protection has suffered from the absence of any widely accepted framework of harms or negative impacts and so, at best, has been idiosyncratic and, at worst, has not taken into account the full range of risks to individuals. As a result, despite many examples of specific applications, risk management still does not achieve its full potential as a critical tool in data protection practice and law.

In January 2014, the Centre for Information Policy Leadership launched a multiyear project on the role of risk management in data protection. This project elaborates on the Centre’s earlier work on organisational accountability, particularly in seeking to develop the analytical framework and tools needed to implement key aspects of accountability. The Centre’s risk project is designed to help “bridge the gap between high-level privacy principles on the one hand, and compliance on the ground on the other”. In its first paper, A Risk-based Approach to Privacy: Improving Effectiveness in Practice, the project sought to understand “what is meant by privacy risks to individuals (and society) and to create a practical framework to identify, prioritise and mitigate such risks so that
principle-based privacy obligations can be implemented appropriately and effectively”. 3 In this
paper, the project addresses the role of risk management—the systematic process of identifying
and assessing risks, avoiding or mitigating them where possible, and then accepting and
managing the remaining risks—in data protection as implemented into legal requirements,
interpreted by regulators and put into practice by responsible organisations. This paper highlights
the growing consensus around risk management as an essential tool for effective data protection,
and addresses key considerations that affect the role of risk in data protection law and practice. A
draft of this paper was provided to the participants in the Centre’s second workshop on the
Privacy Risk Framework and the Risk-based Approach to Privacy, held in Brussels on 18
November 2014. This final version reflects both the thoughtful comments of those participants4
and the wide-ranging discussion at the workshop.
The aim of this step is to build a protection system that (i) allows risks to be treated in a commensurate manner, (ii) complies with [Act-I&L] and (iii) is consistent with the data controller's requirements (legal, financial, technical, etc.).
First of all, risk-treatment measures must be determined. This is done by linking existing or planned measures (identified earlier in the study or in the applicable guidelines) to the risk(s) they help to treat. Additional measures are then added until the risk level is finally considered acceptable.
Notes:
The higher the capabilities of the risk sources, the more robust the measures must be in order to withstand them. Moreover, any incidents that may have already occurred (especially personal data breaches), as well as any difficulties in implementing certain measures, may be used to improve the security system.
The measures specified must be formally set out, implemented, regularly audited and continually improved.
Finally, explanations of why residual risks may be accepted should be given. These explanations may be based on the new severity and likelihood levels and on the benefits offered by the processing operation identified previously (risk-benefit analysis), by applying the following rules:
It may be acceptable to depart from these rules, but only if it is demonstrated that the benefits of
processing greatly outweigh the risks.
Note:
Serious risks may thus be taken if their likelihood is sufficiently low. Certain risks may also be
taken if processing makes it possible to save human lives.
Tool:
The result of this step, which consists of presenting the measures selected to treat each risk and re-estimating the severity and likelihood of each risk, may be summarized in a table such as the one below.
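The treat-and-re-estimate loop described in this step can be sketched as a simple risk register. This is only an illustrative sketch: the 1-4 scales, the acceptance threshold and the example measures are assumptions, not part of the method itself.

```python
# Hypothetical sketch of the risk-treatment step: each risk keeps its current
# severity/likelihood estimate, gains treatment measures, and is re-estimated
# until the residual level is considered acceptable. The 1-4 scales and the
# acceptance threshold are illustrative assumptions.

ACCEPTABLE = 4  # residual severity x likelihood at or below this is accepted

def residual_level(risk):
    """Residual risk level after treatment (severity x likelihood)."""
    return risk["severity"] * risk["likelihood"]

def treat(risk, measure, new_severity, new_likelihood):
    """Record a measure and re-estimate the risk it helps to treat."""
    risk["measures"].append(measure)
    risk["severity"] = new_severity
    risk["likelihood"] = new_likelihood
    return risk

risk = {"name": "Illegitimate access to personal data",
        "severity": 4, "likelihood": 3, "measures": []}

treat(risk, "Encrypt data at rest", new_severity=2, new_likelihood=3)
treat(risk, "Enforce role-based access control", new_severity=2, new_likelihood=2)

print(residual_level(risk) <= ACCEPTABLE)  # True: residual risk accepted
```

In a real study the re-estimation would be justified per risk, as the text requires, rather than asserted numerically.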
Threats that may jeopardize confidentiality
The following table presents the generic threats that can lead to:
i. Illegitimate access to personal data,
ii. Compromise of processing (if this feared event is considered).
Threats that may jeopardize integrity
The following table presents the generic threats that can lead to:
i. Changes in processing,
ii. Unwanted changes of personal data,
iii. Alterations to legal processes (if this feared event is considered).
Threats that may jeopardize availability
The following table presents the generic threats that can lead to:
i. Unavailability of legal processes,
ii. Disappearance of personal data,
iii. Unavailability of processing (if this feared event is considered).
CHAPTER-6
6. Risk Management in Data Protection Regulation
Companies are subject to hundreds of laws and regulations requiring that they identify, assess and manage risks. Many of these requirements are longstanding, for example, Sarbanes-Oxley and the broad obligations on publicly traded companies to identify and disclose material risks in their quarterly or annual filings. Others, such as Basel III and the numerous national requirements imposed on financial institutions to assess and avoid or otherwise respond to risks to their solvency, have been enacted or strengthened more recently.
Managing Information Security Risk (SP 800-39, 2011), published by the US National Institute of Standards and Technology (NIST), is congruent with, and complementary to, NIST SP 800-30 (2012) and guidance on other areas of organizational risk management as part of an Enterprise Risk Management (ERM) programme. ISO 31000 is cited. Although the writing is wholly new (albeit with some repetition of diagrams), there are considerable overlaps with 800-30: the latter focuses more on risk assessment, while 800-39 is more holistic and emphasizes other aspects of risk management. Neither of these NIST publications embraces privacy or data protection as an important element; indeed, both almost completely ignore the topic. Because of this close relationship between the two documents, many details of 800-30 that are described elsewhere in this report will not be repeated here. However, 800-39 develops or emphasizes certain elements, explains certain items at greater length, or introduces a number of new and partly different ones. The following are probably the most important different emphases:
The main purpose, as in 800-30, is information security. Many types of organisational risk are
identified: “program management risk, investment risk, budgetary risk, legal liability risk, safety risk,
inventory risk, supply chain risk, and security risk”. Privacy risk is absent. “Risk” is defined for
present purposes as “information security risk from the operation and use of organizational
information systems including the processes, procedures, and structures within organizations that
influence or affect the design, development, implementation, and ongoing operation of those
systems.” The document emphasises that this must be a matter for senior executives and leaders, and
not confined to a technical “stovepipe” in the organisation, separate from general management.
Senior personnel are therefore given risk management responsibilities and are to be accountable for
their risk management decisions.
There is also an emphasis on “tools, techniques, and methodologies” to be identified for assessing,
developing courses of action, and determining the sufficiency, correctness and effectiveness of risk
responses. As in 800-30, 800-39 analyses the processes and activities at the three organisational tiers,
and adopts the fourfold frame-assess-respond-monitor risk-management process concept. A new
concept is that of risk executive (function).
This is established at the top (organisational) tier as a crucial part of the governance and decision-making structure for risk management; it “serves as the common risk management resource for senior leaders/executives, mission/business owners, chief information officers, chief information security officers, information system owners, common control providers, enterprise architects, information security architects, information systems/security engineers, information system security managers/officers, and any other stakeholders having a vested interest in the mission/business success of organizations.”
Risk tolerance is an important element of risk framing, and indicates “the level of risk or degree of
uncertainty that is acceptable to organizations”, constraining risk management decisions and shaping
oversight, the rigour of the risk assessment, and the responsive strategies adopted. The document
explains enterprise and information security architectures at length in its discussion of Tier 2
(mission/business process). These architectures have much to do with the organisation’s resilience to
threats. In particular, the information security architecture “incorporates security requirements from
legislation, directives, policies, regulations, standards, and guidance”. The description of enterprise
architecture includes “privacy” as one of the risk-reduction aims for the full, organisation-wide
integration of management processes, but this is not explained.
The concepts of trust and trustworthiness are deemed important factors in risk decision-making, with
“trust” defined as “a belief that an entity will behave in a predictable manner in specified
circumstances. The entity may be a person, process, object or any combination of such components.”
An Appendix sets out a number of trust models as alternative ways for organizations to obtain levels
of trust needed to form partnerships and collaborations and to share information. Trustworthiness
relates to assurance about IT products and systems in the face of threats, and susceptibility to
attack shapes the acceptability of levels of risk. Organisational culture (values, beliefs and norms
influencing behaviour and action) is a dimension that 800-39 treats at length, as it affects many if not
all the other elements of risk management. Where the cultures of two organisations differ, or where
parts of the same organisation have different cultures, these “disconnects” may be palpable in terms
of information-sharing: “An example of an internal disconnect can be observed in a hospital that
emphasizes different cultures between protecting the personal privacy of patients and the availability
of medical information to medical professionals for treatment purposes.” We may note that this is an
almost isolated mention of “privacy” in 800-39, and that the example is a classic data protection issue
that PIA would encounter in its analysis of an organisation’s processes. But 800-39 offers no guide to
the resolution of such clashes of culture and the information-sharing decisions that are implicated. A
section on the relationship among all the key risk concepts (governance, risk, tolerance, trust, culture
and investment strategy) then follows, showing their inter-relationship and the importance of the risk
executive (function)’s cognizance of this.
NIST 800-39 moves on to discuss the process for managing risk through the familiar stages of
framing, assessing, responding and monitoring, describing each with more fine-grained sub-processes. This analysis goes beyond 800-30’s focus on risk assessment to describe more fully the
stages of responding to risk and risk monitoring, including several steps in each. There is a large
Appendix that delineates the roles and responsibilities of key organisational participants. Although
they are not here referred to as “stakeholders”, many if not all of them are elsewhere so described.
These roles include: CEO, risk executive (function) – an individual or a group, CIO, information
owner/steward, senior information security officer, authorizing official, authorizing official
designated representative, common control provider, information system owner, information system
security officer, information security architect, information system security engineer, and security
control assessor. If, through an “open door”, a PIA were to be grafted into the risk management
process covered by 800-39, these personnel and their differing but overlapping responsibilities, and
perhaps their differing cultures (and what those cultures might indicate with regard to information
processes that bear upon privacy) would have to be factored into the PIA routine.
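The frame-assess-respond-monitor cycle that 800-39 adopts can be sketched as a simple loop. This is a toy illustration, not NIST's specification: the handler functions, the numeric scales and the tolerance value are all invented for the example.

```python
# Illustrative sketch of the NIST 800-39 frame-assess-respond-monitor cycle.
# The handlers and numeric values below are invented placeholders.

def frame(context):
    """Establish risk tolerance and assumptions (a Tier 1 decision)."""
    context["tolerance"] = 6
    return context

def assess(context):
    """Estimate the risk level (here: a toy severity x likelihood product)."""
    context["risk_level"] = context["severity"] * context["likelihood"]
    return context

def respond(context):
    """Mitigate when the assessed risk exceeds the framed tolerance."""
    if context["risk_level"] > context["tolerance"]:
        context["likelihood"] -= 1  # e.g., apply a security control
    return context

def monitor(context):
    """Re-assess after responding; in practice this loop runs continuously."""
    return assess(context)

ctx = {"severity": 3, "likelihood": 4}
ctx = monitor(respond(assess(frame(ctx))))
print(ctx["risk_level"])  # 9 after one mitigation step
```

The point of the sketch is only the shape of the process: framing constrains what "acceptable" means before any assessment or response takes place.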
3. Does the RM methodology address only information privacy protection or does it address other types of privacy as well?
NIST 800-39 barely mentions privacy, and the example it mentions is of information privacy. Broadening could perhaps be done within the scope of the RM, but adopting a conception of privacy that went beyond information security would be a prerequisite for the organisation.
4. Does the RM methodology say that it should be undertaken when it is still possible to influence the development of the project?
The RM exists at all stages of a project and continuously.
This is an elaborate document that, read together with NIST SP 800-30, gives a highly detailed descriptive guide to risk management in all its stages, procedures, structures and
thought-processes. As with 800-30, but perhaps to a lesser extent, there may be “touch points”,
“open doors”, and other affordances in NIST 800-39 and in the PIA Handbook that could
be worth developing. Although hardly any mention is made of privacy, the specific focus of 800-
39 on security risk should not rule this out, especially if 800-30 is implemented in conjunction
with it and if the latter can be oriented more firmly towards PIA.
If PIA can be inserted into the security concerns of 800-39, PIA responsibility could be grafted
onto the role of “risk executive (function)” in the governance and decision-making structure for
risk management. The emphasis on organisational culture, and the example of cultural
“disconnect” between attitudes towards data-sharing, could be a doorway for helping
organisations resolve such dilemmas through the analysis that PIA would bring to these
situations. In addition, the “stakeholder” framework could be adapted to PIA purposes.
ISACA (originally known as Information Systems Audit and Control Association) originated
in 1969 as the EDP Auditors Association. Since those origins, the members of ISACA, who
serve in a variety of IT-related positions, are found in 190 chapters in over 180 countries, and
currently exceed 100,000 in number. ISACA established a research affiliate, the IT
Governance Institute (ITGI), in 1998. The focus of the organisation is upon developing
knowledge around information systems assurance, control, and security, as well as
governance of IT and related risk and compliance issues.
COBIT (Control Objectives for Information and related Technology), originally published in 1996 and now released in version 5, is a process framework for IT that encompasses companion frameworks for the value of IT business investments (Val IT) and for risk management (Risk IT). COBIT, like other IT governance frameworks, focuses upon the efficient and effective use of IT assets, and includes the following key areas: strategic alignment, value delivery, risk management, resource management, and performance management.
IT Governance Model
COBIT itself is a framework and does not aim to provide in-depth guidance on every aspect of
managing and governing IT. COBIT refers users of the framework to other more detailed
standards such as ITIL (for service delivery), CMM (for solution delivery), ISO 17799 (for
information security) and PMBOK or PRINCE2 (for project management). Over time, more
than 40 international IT standards, frameworks, guidelines, etc. have been consulted for the
development of COBIT, including notably those published by COSO, OGC, ISO, SEI, PMI, and
ISF. The COBIT framework ties together business requirements with IT processes and IT
resources:
Business requirements: effectiveness, efficiency, confidentiality, integrity, availability, compliance, reliability.
IT processes: domains, processes, activities.
IT resources: applications, information, infrastructure, people.
The process model for COBIT comprises four domains with 34 generic processes aimed at
“managing the IT resources to deliver information to the business according to business and
governance requirements”. The four domains are 1) plan and organise, 2) acquire and
implement, 3) deliver and support, and 4) monitor and evaluate. The COBIT framework
provides a process description, control objectives, management guidelines and a maturity
model for each distinct process within these domains.
The process description indicates which IT process is controlled, how it satisfies business
requirements, and how it is achieved and measured. The process is decomposed into a series
of specific activities. The management guidelines define which processes provide inputs, and
which outputs are created by the process. A RACI (Responsible, Accountable, Consulted, or Informed) chart is provided for each activity in the process, and goals and metrics for the process are established.
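A RACI chart of the kind COBIT prescribes can be represented very simply. The process name and the role assignments below are invented for illustration; they are not taken from the COBIT documentation.

```python
# Illustrative RACI chart for one COBIT-style process (assignments invented).
# R = Responsible, A = Accountable, C = Consulted, I = Informed.
raci = {
    "Assess and manage IT risks": {
        "CIO":                 "A",
        "Risk manager":        "R",
        "Information owner":   "C",
        "Business management": "I",
    },
}

def roles_with(chart, process, code):
    """Return the roles holding a given RACI code for a process."""
    return sorted(r for r, c in chart[process].items() if c == code)

print(roles_with(raci, "Assess and manage IT risks", "C"))  # ['Information owner']
```

In COBIT the chart is drawn per activity rather than per process, but the lookup pattern is the same.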
Within the “plan and organise” domain, 10 processes are described. They concern defining a
strategic IT plan; defining the information architecture; determining the technological
direction; defining the IT processes, organisation and relationships; managing the IT
investment; communicating management aims and direction; managing IT human resources;
managing quality; assessing and managing IT risks; and managing projects. Key areas where
privacy and data protection elements may be introduced are within the following activities:
AI1.2 - Risk Analysis Report
AI2.1 - High-level Design
AI2.2 - Detailed Design
AI2.3 - Application Control and Auditability
AI3.2 - Infrastructure Resource Protection and Availability
AI6.2 - Impact Assessment, Prioritization and Authorization
The “deliver and support” domain comprises 13 processes. These processes include defining and
managing service levels; managing third-party services; managing performance and capacity;
ensuring continuous service; ensuring systems security; identifying and allocating costs;
educating and training users; managing service desk and incidents; managing the configuration;
managing problems; managing data; managing the physical environment; and managing
operations. Key areas where privacy and data protection elements may be introduced are within
the following activities:
Of particular interest in this context is Risk IT, which was originally published in 2009, based
upon the then current version of COBIT (4.1). “The Risk IT framework is based on the
principles of enterprise risk management (ERM) standards/frameworks such as COSO ERM
and AS/NZS 4360 (soon to be complemented or replaced by ISO 31000) and provides insight
on how to apply this guidance to IT.” The process model presented under Risk IT includes
three domains: risk governance, risk evaluation, and risk response. In turn, each of these domains includes three defined processes.
The following examines COBIT and Risk IT within the context of how they relate to the key
touch points for PIA, and how and where PIA may fit into the framework as it currently
exists.
Touch point questions and evidence from COBIT:
1. Does the RM methodology include provisions about compliance with legislation and any relevant industry standards, codes of conduct, internal policy, etc.?
In COBIT, Process ME3.3 – Evaluation of Compliance with External Requirements provides for this type of review.
2. Is the RM methodology regarded as a process or is it simply about producing a report?
It is a framework that supports the application of other risk management methodologies, and provides in that context a strategic approach to risk, which is cyclical in nature.
3. Does the RM methodology address only information privacy protection or does it address other types of privacy as well?
It is expansive and addresses a broad range of risks that may be applicable. Privacy is not specifically identified, but is included within the approaches taken for ensuring compliance.
4. Does the RM methodology say that it should be undertaken when it is still possible to influence the development of the project?
It is aimed at tying business value to IT processes, including those related to risk management. As such, risks are contemplated in the earliest stages of a project or programme and continually evaluated and responded to.
5. Does the RM methodology place responsibility for its use at the senior executive level?
Yes. IT risk management defined within the COBIT and Risk IT frameworks is driven by a governance model that relies upon a definition of risk appetite/tolerance at strategic levels in the organisation (i.e., Board or most senior level), and integrates with enterprise-level risk management.
6. Does the RM methodology call for developing a plan and terms of reference? Does it include a consultation strategy appropriate to the scale, scope and nature of the project?
COBIT calls for strategic planning in the Plan and Organize domain, and Risk IT establishes activities to be pursued in the Risk Governance domain, each involving a broad range of stakeholders.
7. Does the RM methodology call for conduct of an environmental scan (information about prior projects of a similar nature, drawn from a variety of sources)?
While there is no explicit call for an environmental scan, one of the four domains, “Monitor and Evaluate”, primarily focuses upon external regulatory and compliance issues, and should typically lead to such a generalised environmental scan.
8. Does the RM methodology include provisions for scaling its application according to the scope of the project?
No.
9. Does the RM methodology call for consulting all relevant stakeholders, internal and external to the organisation, in order to identify and assess the project’s impacts from their perspectives?
Within the “Plan and Organize” domain, the activity PO10.4 is aimed at ensuring all stakeholders are engaged and provide inputs to the definition and execution of the project.
10. Does the RM methodology include provisions for putting in place measures to achieve clear communications between senior management, the project team and stakeholders?
Process PO6 (within the “Plan and Organize” domain), Communicate Management Aims and Direction, includes the activity PO6.5, Communication of IT Objectives and Direction. This activity ensures that all stakeholders are provided with an awareness and understanding of business and IT objectives and direction.
11. Does the RM methodology call for identification of risks to individuals and to the organisation?
It defines the processes related to the identification of risk within the PO9 “Assess and Manage IT Risks” process and its related activities. In addition, these processes are defined in more detail in the related Risk IT framework.
12. Does the RM methodology include provisions for identifying protection measures and/or design solutions to avoid or to mitigate any negative impacts of the project or, when negative impacts are unavoidable, does it require justification of the business need for them?
It calls for high-level and detailed design (AI2.1 and AI2.2) to be completed within the context of the organisation’s technological direction and information architecture, within which standards should be defined to avoid negative impacts.
13. Does the RM methodology include provisions for documenting the process?
Numerous artefacts are expected to be produced within the framework, enabling communication of outputs from one process as inputs to other processes, creating effective linkages of the business and IT processes within the various domains.
14. Does the RM methodology include provision for making the resulting document public (whether redacted or otherwise)?
No. There is no discussion of communication outside of the defined stakeholders.
15. Does the RM methodology call for a review if there are any changes in the project?
Risk management is viewed as a continuous cycle and is applied to both projects and ongoing IT services.
16. Does the RM methodology include provisions for an audit to ensure that the organisation implements all recommendations or, if not all, that it has provided adequate justification for not implementing some recommendations?
In the “Monitor and Evaluate” domain, the activity ME3.4, Positive Assurance of Compliance, is aimed at ensuring that “any corrective actions to address any compliance gaps have been taken by the responsible process owner in a timely manner.”
For the purpose of identifying a window for inclusion of PIAs within the COBIT framework,
our assessment leads us to believe that many of the key elements of PIA are implicitly
included in the framework, especially with respect to the processes in the “Monitor and
Evaluate” domain, which calls for adherence with external compliance and regulatory factors.
Moreover, as a framework, where COBIT relies upon other standards such as ITIL, ISO
31000, COSO, and others, inclusion of PIA within those other standards will necessarily roll up into the processes observed by COBIT user organisations.
This risk management method was created in 1995 by the Agence Nationale de la Sécurité
des Systèmes d'Information (ANSSI),108 the French Network and Information Security
Agency (FNISA), and was first released in 1997.109 Since then, there have been two major
updates: in 2004 and in 2010. Among other improvements, the revisions have introduced
better compatibility with international standards on information security management and risk
management, namely ISO 27001, ISO 27005, ISO Guide 73 and ISO 31000.
To date, the EBIOS method is only available in French; however, an English version is
awaiting approval and should be available soon. As such, EBIOS is mainly used in France,
where it is recommended for public administrations and for private companies that are
carrying out contracts for the Defence Ministry or that have strong needs in terms of
information security.
EBIOS is also used abroad in French-speaking countries, and ENISA has drawn on
EBIOS. The use of EBIOS is suitable for various types of structure, ranging from small and
medium-sized companies and local authorities to multi-national companies as well as
international organisations. Since 2006, EBIOS has been supported by the “Club EBIOS”,
which is a user group, independent of ANSSI, formed by public and private sectors
organisations as well as individual experts.
It is a kind of tool-box which comes as a set of two main documents. The 97-page Risk
Management Method gives an overview of risk management and then focuses on
information security (Chapter 1). It explains what EBIOS is and how it works (Chapter 2),
and describes each of the activities that make up the approach (Chapter 3). It also includes a demonstration of its coverage of international standards.
Within EBIOS, an information security risk is a combination of the following four elements:
A threat source,
A threat,
A vulnerability,
An impact.
Thus, EBIOS focuses on the identification of those four elements as well as on the proposal of
various scenarios that combine them in likely ways. Through this, EBIOS allows the risk
manager to assess and treat risks. It also provides all the elements necessary for communication within the organisation and with its partners, as well as for the validation of risk treatment.
EBIOS is an iterative method suitable for producing many types of deliverables ranging from
an organisation’s information security policy and a security strategy to a risk map or a
treatment plan. Since its last release in 2010, EBIOS has been restructured into five modules
to comply with the requirements of ISO 27001, ISO 27005 and ISO 31000. Figure 3.4 below
shows the organisation of those modules as a five-step process.
CHAPTER-7
7. The ISO/IEC 29100 Privacy Framework
It defines PII as any information that can be used to identify a PII principal (a person or a “data
subject”, to use EC terminology) or that might be linked to a PII principal, either directly or
indirectly. It defines privacy principles in terms of PII, so the standard does not address all types
of privacy. Organizations can use the framework to help define their “privacy safeguarding
requirement”. The framework describes such requirements and lists privacy principles based on
other well-known guidance documents. The standard can also support other privacy
standardization activities, such as privacy risk assessments and controls.
The standard comprises five sections, one annex and a bibliography. Section 2, on definitions,
includes an interesting note that equates a privacy impact assessment with a privacy risk
assessment, which it defines as the “overall process of risk identification, risk analysis and risk
evaluation with regard to the processing of personally identifiable information”. The definition
does not include stakeholder consultation or even finding solutions to privacy risks.
It focuses on the basic elements of a privacy framework. It discusses actors and roles,
interactions, recognizing PII, privacy safeguarding requirements, privacy policies and controls.
It identifies four types of actors involved in processing PII, namely, the PII principals,
controllers, processors and third parties. It says a PII principal does not always have to be
identified by name. These different actors (stakeholders) can interact with each other in a variety
of ways. The standard includes a table with several different scenarios showing possible
information flows between the PII stakeholders (actors). It clarifies how information can be
considered PII, e.g., if the information has an identifier that refers to a person, and it regards as
PII any information that distinguishes one person from another (e.g., biometric data). The
standard makes the point that it may be possible to identify someone even if there is no
single attribute that uniquely identifies her. A combination of two or three or more attributes
may be enough uniquely to identify the person. It provides a long list of example attributes
that can be used to identify a person.
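The point that a combination of attributes may identify a person even when no single attribute does can be illustrated with a toy dataset. The records below are invented for the example.

```python
from collections import Counter

# Toy records: no single attribute below is unique across all records, but the
# combination (date of birth, postcode) singles out each person. Data invented.
records = [
    {"dob": "1980-03-14", "postcode": "411001", "sex": "F"},
    {"dob": "1980-03-14", "postcode": "411002", "sex": "M"},
    {"dob": "1975-07-02", "postcode": "411001", "sex": "M"},
]

def unique_under(attrs):
    """True if every record is uniquely identified by the given attributes."""
    keys = [tuple(r[a] for a in attrs) for r in records]
    return all(count == 1 for count in Counter(keys).values())

print(unique_under(["dob"]))              # False: two people share a DOB
print(unique_under(["dob", "postcode"]))  # True: the pair is identifying
```

This is why such attribute combinations (often called quasi-identifiers) are treated as PII even though each attribute alone looks harmless.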
Privacy safeguarding requirements may arise whenever an organization processes PII – e.g.,
in the collection, processing and storage of PII and in the transfer of PII to others, including
others in third countries. The standard encourages organizations to identify privacy
safeguarding requirements whenever they design an ICT system that will be used to process
PII. It says the privacy risk management process comprises five main elements:
At this point, the standard refers again to PIA, which it describes as that part of risk
management that focuses on ensuring compliance with legislation and assessing the privacy
implications of any new or modified programs. It says that privacy safeguarding requirements
and PIAs should be part of the organization’s risk management framework, and describes
privacy risk management as a process. That process should take into account various factors,
including legal and regulatory, contractual, business, and others. Among the other factors are
the privacy preferences of PII principals. The standard indirectly refers to “privacy by design”
(PbD) when it says that ICT system designers should take into account the likely privacy
preferences of the PII principals. Organizations should respond to the privacy safeguarding
requirements with a set of privacy controls as an outcome of their privacy risk assessment and
treatment. The controls should be embedded in the organization’s approach to PbD and in its
information security management framework. The standard also says that top management
should be involved in the establishment of the organization’s privacy policy. Distinguishing
between an internal and an external privacy policy, the standard says that the organization
should document the controls used to enforce the policy.
It provides a list of privacy principles that were abstracted from those promulgated by various
countries and international organizations. It says the privacy principles are to guide the
design, development and implementation of privacy policies and controls. ISO 29100
formulates 11 privacy principles, as follows:
Consent and choice means the PII principal must have a freely given, specific and
knowledgeable choice (opt-in) about the processing of her PII. A PII principal should be able
to withdraw her consent without penalty.
Purpose legitimacy and specification means ensuring that the purpose(s) complies with
applicable law, and communicating the purpose(s) to the PII principal before the organisation
collects the information.
Collection limitation means limiting the collection of PII to that which has a legal basis and
to not more than necessary for the specified purpose(s). The standard says organisations should
justify and document the PII they collect.
Data minimisation means minimising the PII processed and the number of people who
have access to such data.
Use, retention and disclosure limitation means limiting use, retention and disclosure to what is
necessary to fulfil specific, explicit and legitimate purposes, and retaining such data only as
long as necessary to meet the specified purpose.
Accuracy and quality mean that the data controller must ensure that the PII is accurate and
relevant for the specified purpose.
Openness, transparency and notice mean that the data controller should provide PII
principals with clear and easily accessible information about its policies, procedures and
practices in regard to the processing of PII. The data controller should also inform the PII
principals about who might be provided with the PII and whom they can contact at the
controller’s address if they have questions or want to access their data.
Individual participation and access means enabling the PII principals to access, review and
correct their PII, provided their identity is authenticated.
Information security means protecting PII to ensure its integrity, confidentiality and
availability, and protecting it against unauthorized access, use or loss.
Privacy compliance means ensuring that the processing meets data protection and privacy
safeguards (legislation and/or regulation), and enabling the conduct of audits. It also
means that the organization should conduct privacy risk assessments to ensure, among other
things, that the organisation complies with laws and regulations and safeguarding
requirements.
2. Is the RM methodology regarded as a process or is it simply about producing a report?
The section on privacy safeguarding requirements refers to the privacy risk management process. A note also refers to PIA, which is a process.

3. Does the RM methodology address only information privacy protection or does it address other types of privacy as well?
This standard is focused on personally identifiable information (PII).

4. Does the RM methodology say that it should be undertaken when it is still possible to influence the development of the project?
Not exactly, but it does say that the design of any ICT system involving the processing of PII should be preceded by an identification of the relevant privacy safeguarding requirements.

6. Does the RM methodology call for …
No, it does not talk about developing a plan.

12. Does the RM methodology include provisions for identifying protection measures and/or design solutions to avoid or to mitigate any negative impacts of the project or, when negative impacts are unavoidable, does it require justification of the business need for them?
Yes, it calls protection measures "privacy safeguarding requirements".

13. Does the RM methodology include provisions for documenting the process?
Yes, the organization should document its privacy policy (both internal and external policies). It also says privacy controls should be documented. It says organisations should document the type of PII collected as well as the justification for doing so.

14. Does the RM methodology include provision for making the resulting document public (whether redacted or otherwise)?
Not specifically, although it does mention communicating with stakeholders. Further, one of the privacy principles focuses on openness, transparency and notice. There, it says the organization should provide stakeholders with clear information about its PII policies and practices, the purpose for which it is processing PII, how to contact the controller, the choices open to PII principals, access to their data, …
Conclusions and recommendations
ISO 29100 is not a privacy risk management methodology per se, so it is a bit unfair to assess
it as such. Its primary focus and value is on privacy terminology and, especially, privacy
principles. In section 5, wherein the privacy principles are identified and discussed, there is
operational guidance, as the foregoing indicates. It has many “touch points” in common with
the ICO PIA Handbook. As it is not, strictly speaking, a risk management methodology or
process, it offers no “open doors” wherein a PIA could be conducted. However, it does refer
to the privacy risk management process (notably in the section dealing with privacy
safeguarding requirements) wherein there are “open doors”, e.g., in regard to establishing the
context, assessing and treating risks, communicating and consulting with stakeholders, and
monitoring and review.
CHAPTER-8
A privacy risk register is a tool that allows you to collate, record, track and manage all your data
protection, information security and privacy risk information in one place. This overview
guides you through the process of creating a privacy risk register.
In order to formulate an effective privacy risk register, you must first identify the risks your firm
faces. You can do this by completing a risk assessment.
There is no established format for a risk assessment, but there are several factors it would make sense to consider.
You should also consider the findings from any internal audits or investigations into any data
security breaches and sector specific data from the Information Commissioner's Office (ICO) on
the main causes of data protection breaches: see Data security incident trends.
Precedent Data protection risk assessment guides you through the process of assessing your
risks, using the above criteria. For each risk you identify in the risk assessment, you are given the
option to:
Record an action point to address the risk immediately—this would be suitable for simple
risks that are capable of quick and simple resolution, or
Add the risk to your privacy risk register—which you should do for risks that cannot be
addressed quickly
At the conclusion of your Data protection risk assessment, you will have a list of risks to add to
your privacy risk register.
Assigning a score to each risk on your register will help you prioritise your privacy risks and
respond accordingly.
So, for each risk you have identified, consider two questions:
1. If that risk materialized, how bad would it be, i.e. what is the impact? Assign a score to your answer: 3 = high impact, 2 = medium impact, 1 = low impact.
2. How likely is it that the risk will materialize, i.e. what is the probability? Assign a score to your answer: 3 = high probability, 2 = medium probability, 1 = low probability.
You then multiply the 'impact' score by the 'probability' score to produce a risk rating (i.e. final
score) for that particular risk. The higher the score, the more concerned you should be.
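The scoring can be sketched as a small helper. The convention used here is the usual 3 x 3 matrix orientation (3 = high, 1 = low), so that a higher product really does mean greater concern; the register entries are illustrative only:

```python
# Sketch: risk rating = impact score x probability score on a 3 x 3 matrix,
# with 3 = high, 2 = medium, 1 = low for both dimensions.
def risk_rating(impact, probability):
    if impact not in (1, 2, 3) or probability not in (1, 2, 3):
        raise ValueError("scores must be 1 (low), 2 (medium) or 3 (high)")
    return impact * probability

# Illustrative register entries: (risk, impact, probability)
register = [
    ("Unencrypted laptops", 3, 2),
    ("Stale access rights", 2, 2),
    ("Paper files left in meeting rooms", 1, 3),
]

# Highest-rated risks first, so they get attention first
for name, i, p in sorted(register, key=lambda r: risk_rating(r[1], r[2]), reverse=True):
    print(f"{name}: rating {risk_rating(i, p)}")
```

On this scale a rating of 9 (high impact, high probability) demands immediate treatment, while a rating of 1 can usually be accepted and simply monitored.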
8.1 Completing the privacy risk register
Precedent: Privacy risk register takes the form of an Excel spreadsheet and consists of three tabs:
A sample Privacy risk register—this has been populated with a number of risks to
demonstrate how the privacy risk register is intended to be used
A blank Privacy risk register, which you can populate with any risks you identify for your
business
Drafting Notes—this explains how to assign a numerical value to each risk, using a 3 x 3
risk matrix
There is no requirement to formulate a data protection risk register, but ICO guidance on privacy
impact assessments (PIAs) suggests this is a good idea.
A data protection risk assessment is not the same as a PIA. The former identifies data protection,
information security and privacy risks across your business. The latter identifies the privacy risks
associated with a discrete project, e.g. to introduce a new HR system.
CHAPTER-9
The telecom solution pack, for example, contains information like phone IMEI numbers and the
regulations that are obligatory for the telecom industry.
After we find out what is confidential for our customer, we need to help him or her protect that
data, and then the following question comes to mind:
How can we find our Confidential Data?
There are a few methods that should be considered. These methods mostly concern DLP
capabilities and may not map directly onto another vendor's solution.
1) Consult with the Customer/CISO - we need to be in touch with employees who have a cross-
organizational view and approach. Most of the time, knowledgeable personnel can tell us where
60%-80% of the confidential data in the organization is stored.
2) Use DLP's network monitoring ability - DLP has the ability to "tap into" the heart of the
network using a Mirror Port (also known as a SPAN port). The network monitor can analyze all
of the network traffic. It gives a good indication of the information traversing our network, the
transmission method (instant messaging, mail, file copying and more) and the destination of the
data. Once we have collected enough traffic data, we can create rules and policies from it. When
implementing DLP in the organization, it is suggested to install Network Monitor first in order to
study the network. The amount of learning time needed varies from client to client and depends
on the network bandwidth in use.
3) Use DLP Network Discover/Protect - DLP has the ability to scan a variety of components
(databases, SharePoint sites, storage, file servers, endpoint clients and more) in order to find
confidential data that is lying around on the corporate network.
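The discovery idea can be sketched as a simple content scanner. The patterns and directory walk below are illustrative assumptions, not the actual detection logic of Symantec DLP or any other product:

```python
# Sketch: walk a directory tree and flag files whose content matches
# simple "confidential data" patterns (illustrative regexes only).
import os
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "card_like": re.compile(r"\b(?:\d[ -]?){15}\d\b"),  # 16-digit card-like number
}

def discover(root):
    """Yield (path, pattern_name) for every readable file containing a match."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    text = fh.read()
            except OSError:
                continue  # skip unreadable files rather than abort the scan
            for label, pattern in PATTERNS.items():
                if pattern.search(text):
                    yield path, label
```

A real Discover scan would also cover databases, file shares and endpoints, and would rely on fingerprints and described content rather than bare regular expressions.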
Security breaches rocked 2015. Sensitive data from high-profile organizations ranging from
banks and multinational conglomerates to illicit online dating services fell into the hands of
hackers, affecting millions of customers and employees. It was a terrible year for data privacy
and security, and a wake-up call for chief technology officers and corporate legal departments
everywhere.
Data volume has been growing exponentially, dramatically increasing opportunities for theft and
accidental disclosure of sensitive information. In the past, the amount of data doubled every four
years. According to technology research firm IDC, it now doubles every two years, and by 2020
the digital universe (the data we create and copy annually) will reach 44 zettabytes, or 44
trillion gigabytes. These facts, along with increases in the portability of data, employee mobility
and penalties for failing to comply with strict data protection regulations raise the question:
“What more can organizations do to protect themselves and their stakeholders?” An integral part
of the answer may be data loss prevention (DLP).
DLP identifies, monitors and protects data in use, data in motion on your network, and data at
rest in your data storage area or on desktops, laptops, mobile phones or tablets. Through deep
content inspection and a contextual security analysis of transactions, DLP systems act as
enforcers of data security policies. They provide a centralized management framework designed
to detect and prevent the unauthorized use and transmission of your confidential information.
DLP protects against mistakes that lead to data leaks and intentional misuse by insiders, as well
as external attacks on your information infrastructure.
In the wake of recent security events, interest in the technology has exploded. In its “Forecast
Overview: Information Security, Worldwide, 3Q15 Update” report, Gartner predicted that DLP
will be among the fastest-growing security segments through 2019, with a combined annual
growth rate of nearly 10 percent.
The loss of sensitive data and other forms of enterprise information can lead to significant
financial losses and reputational damage. While companies are now well-aware of these dangers
and data protection has become a hot topic, many organizations aren’t very familiar with
content-aware technologies, and don’t fully understand the business case for DLP initiatives.
With this context in mind, we have outlined 10 reasons your organization needs data loss
prevention.
1. You aren’t sure where your company’s confidential data is being stored, where it’s being sent
and who is accessing it.
DLP technology provides IT and security staff with a 360-degree view of the location, flow and
usage of data across the enterprise. It checks network actions against your organization’s security
policies, and allows you to protect and control sensitive data, including customer information,
personally identifiable information (PII), financial data and intellectual property. With a
thorough understanding of this data, your organization can set the appropriate policies to protect
it, and make risk-prioritized decisions about what assets need to be protected and at what cost.
2. Your company has a plan for protecting data from external intruders, but does not protect
against theft and accidental disclosure of sensitive information by employees and partners.
Not all data loss is the result of external, malicious attacks. The inadvertent disclosure or
mishandling of confidential data by internal employees is a significant factor. DLP can detect
files that contain confidential information and prevent them from leaving via the network. It can
block sensitive data transfers to Universal Serial Bus (USB) drives and other removable media.
DLP also offers the ability to apply policies that safeguard data on a case-by-case basis. For
example, if a security event is detected, access to a specific workstation can be blocked instantly.
Policies can also quarantine or encrypt data in real-time response to events.
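The case-by-case policy response described above can be sketched as a minimal rule table. The severity levels and action names are illustrative assumptions, not a real DLP policy language:

```python
# Sketch: map a detected event's severity to a policy response,
# mirroring the block / quarantine / encrypt responses described above.
RESPONSES = {
    "high": "block",        # e.g. stop the transfer and lock the workstation
    "medium": "quarantine", # hold the file for manual review
    "low": "encrypt",       # allow the transfer, but encrypt the payload
}

def respond(severity):
    """Return the policy action for an event; unknown severities are just logged."""
    return RESPONSES.get(severity, "log")

print(respond("high"))  # block
print(respond("low"))   # encrypt
```

The point of the table is that responses are graduated: not every incident warrants blocking, and policy can be tuned per channel or per user group.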
3. You are concerned about the liability, negative exposure, fines and lost revenue associated
with data breaches.
Data breaches have been making headlines with alarming frequency. They can wreak havoc on
an organization’s bottom line through fines, bad publicity, loss of strategic customers and legal
action. According to PWC’s 2015 Global State of Information Security Survey, organizations
reported 2014 financial losses stemming from security incidents that were 93 percent higher than
in 2013. In fact, the number of global incidents is growing faster than the number of global
smartphone users and the global GDP combined!
4. You are concerned about your next audit and want to maintain compliance with complex
regulations.
More than 50 countries have enacted data protection laws that require organizations in both the
public and private sectors to safeguard sensitive information. Penalties for noncompliance with
strict privacy regulations and breach notification laws continue to grow. Requirements reach
beyond the simple provision of written policies to prove compliance. Technology controls are
becoming necessary to achieve compliance in certain areas. DLP provides these controls, as well
as policy templates and maps that address specific requirements, automate compliance, and
enable the collection and reporting of metrics.
5. You need to protect proprietary information against security threats caused by enhanced
employee mobility and new communication channels.
Many employees are turning to social networking, instant messaging and other Web 2.0
applications to keep up with consumer trends. DLP helps to prevent the accidental exposure of
confidential information across these unsecure lines of communication while at the same time
keeping them open for appropriate uses. With the proliferation of mobile devices and employees
working remotely, corporate data increasingly resides both in and outside of the organization.
Wherever data lives (in transit on the network, at rest in storage, or in use on a laptop or
smartphone), DLP can monitor it and significantly reduce the risk of data loss.
6. You would like to monitor your organization for inappropriate employee conduct and maintain
forensic data of security events as evidence.
Insiders represent a significant risk to data security. An employee who emails a work-related
document to his personal account in order to work over the weekend may have good intentions.
However, he or she poses a tremendous threat when there is confidential data involved. DLP
technology offers 360-degree monitoring that includes email (both corporate accounts and
webmail), instant messages, keystrokes typed, documents accessed and software applications
used. It also allows you to capture and archive evidence of incidents for forensic analysis. With
DLP, you can limit and filter Web surfing, and control which applications employees can access.
It is an invaluable tool in the effort to stop dangerous or time-wasting activities, and helps to
detect problems before they can damage your business.
7. You are uncertain of your organization’s level of protection for confidential data in cloud
applications and storage.
Large amounts of data are being moved to applications in the cloud—an environment in which it
is not apparent where data will be physically stored and processed. Protecting sensitive
information in virtual and cloud models is critical. DLP recognizes confidential data and
automates its encryption at rest, in motion and in use, preventing its transmission to third-party
infrastructures.
8. Your organization would like to proactively prevent the misuse of data at endpoints, both on
and off the corporate network.
9. You would like to automate corporate governance as a means of improving compliance while
saving time and resources.
DLP capabilities for the enforcement and automation of corporate policies and processes can
help improve technical and organizational efficiencies, promote compliance, and provide
methods for more comprehensive information governance. DLP provides up-to date policy
templates and maps that address specific requirements, automate compliance, and enable the
collection and reporting of metrics. When a policy need is identified, DLP can make the change
as simple as enabling an appropriate policy template on your system.
10. You would like to gain a competitive advantage, in both brand value and reputation.
When organizations fail to take the necessary steps to identify sensitive data and protect it from
loss or misuse, they are risking their ability to compete. Whether it’s a targeted attack or an
inadvertent mistake, confidential data loss can diminish a company’s brand, reduce shareholder
value, and irreparably damage the company’s reputation. DLP facilitates the protection of
valuable trade secrets and other vital intelligence, and helps to prevent the negative publicity and
loss of customers that inevitably follow data breaches.
When all these threats and forms of data theft are covered by products like data loss prevention,
customer satisfaction becomes an integral part of the whole process of innovation, which leads to
customer delight. The need for the product is the need to end customers' worries. If those worries
and anxieties are taken care of by the product companies, based on the current situation and data
threat analysis, the customer will consider purchasing other products from the same vendor and
may also request to integrate them into their environment. Word of mouth spreads fast, and in
the IT security industry it spreads like fire, so different organizations will come together to have
these products on board.
Many are surprised by how many of these 10 reasons apply to their business. Many organizations
don’t fully understand the benefits DLP offers. Developing a comprehensive data loss prevention
strategy shouldn’t be an afterthought. When properly deployed, DLP can transform sensitive
data into an operational asset, and help to prevent your organization from making the
wrong kind of headlines.
Data Strategies
Every organization fears losing its critical, confidential, highly restricted or restricted data. The
fear of losing data is amplified if the critical data is hosted outside the organization's premises,
say in a cloud model. To address this fear, a security concept known as "Data Loss Prevention"
has evolved, and it comes in several product flavors in the market, the best known being
Symantec, McAfee, Websense, etc. Each DLP product is designed to detect and prevent data
from being leaked, and these products are applied across all channels through which data can be
leaked.
The IT strategy for a data loss prevention product is to penetrate small-scale companies as well,
so that having data loss prevention on the company's network becomes a compliance expectation
for data integrity and confidentiality. It requires strong IT leadership, such as the chief
information officer and chief technology officer, working closely with the business, budget and
legal departments as well as other groups in the organization, so that a streamlined budget plan
exists for all the companies who wish to buy the product or customize it to their needs, and the
maximum number of people can benefit from the technology.
There is also a strategy of integrating data loss prevention with other vendors' products.
Data loss prevention is the way to secure your valuable data. As data has been increasing in
complexity and confidentiality, more advanced technologies are embedded into the product to
maximize its utility for industries that value their data and want to improve their ability to find
data breaches in the current environment. Data loss prevention is also being introduced in
countries where the governing laws for data privacy are robust and strict, and they too are now
participating in this quest globally.
DLP products come with built-in policies that are already aligned with compliance standards
like PCI, HIPAA, SOX, etc. Organizations just need to tune these policies to their
organizational footprint. But the most important thing in a DLP strategy is to identify the data to
protect, because if an organization simply deploys DLP across the whole organization, a large
number of false positives will result. The section below covers the data classification exercise.
The first thing every organization should do is to identify all the confidential, restricted, and
highly restricted data across the whole organization and across the three channels, i.e. data in
transit, data at rest and data in use. DLP products work with signatures to identify any restricted
data when it is crossing boundaries. To identify the critical data and develop its signatures, DLP
products use a technique known as fingerprinting. Data is stored in various forms at various
locations in an organization, and it all requires identifying and fingerprinting. Various products
come with a discovery engine which crawls all the data, indexes it and makes it accessible
through an intuitive interface, allowing quick searches to find a data item's sensitivity and
ownership details.
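Exact-match fingerprinting can be sketched by hashing known confidential documents and checking outbound content against the hashes. Real DLP engines also do partial and statistical matching, so treat this as a toy illustration with made-up document contents:

```python
# Sketch: exact-match document fingerprinting with SHA-256.
import hashlib

def fingerprint(data):
    """Return a stable fingerprint (SHA-256 hex digest) for a document."""
    return hashlib.sha256(data).hexdigest()

# Index built by the discovery engine from known confidential documents
confidential_docs = [b"Q3 acquisition plan - internal only", b"customer PAN export 2017"]
index = {fingerprint(doc) for doc in confidential_docs}

def is_confidential(outbound):
    """True if an outbound payload exactly matches a fingerprinted document."""
    return fingerprint(outbound) in index

print(is_confidential(b"customer PAN export 2017"))  # True
print(is_confidential(b"today's lunch menu"))        # False
```

Storing fingerprints rather than the documents themselves also means the detection index does not itself become another copy of the confidential data.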
Strengths
Loyal customers
Market share leadership
Discovers confidential data
Data safety
Data analysis
Available for all platforms, like Windows, Mac and Linux machines
End users: Fortune 500 companies
Weaknesses
Opportunities
Threats
Competition
Product substitution
Similar software available for free
Many threats, especially from a security perspective, are fairly easy to delineate. If the
organization is subject to regulations such as PCI DSS, HIPAA or SOX, the cost of non-
compliance can be astronomical: the fines are stiff, and the costs of reputational damage often
far outweigh them.
Data loss prevention sets the benchmark for customer satisfaction in terms of customer needs.
Needs are basically evaluated by the security auditing team and the governing bodies, which
have enough information on data and data confidentiality. There are solution packs for the
financial, educational, hospitality, manufacturing, health care, insurance, media and
entertainment, pharmaceutical, retail, telecom and other industries. These solution packs contain
the policies, roles, reports, protocols and incident statuses that support a particular industry or
organization, based on the lines of business they are in.
Customer satisfaction is measured based on the needs of end user or organizational needs to
safeguard their confidential data.
Data classification, in the context of information security, is the classification of data based on its
level of sensitivity and the impact to the University should that data be disclosed, altered or
destroyed without authorization. The classification of data helps determine what baseline
security controls are appropriate for safeguarding that data. All institutional data should be
classified into one of three sensitivity levels, or classifications:
A. Restricted Data
B. Private Data
C. Public Data
Data should be classified as Public when the unauthorized disclosure, alteration or
destruction of that data would result in little or no risk to the University and its affiliates.
Examples of Public data include press releases, course information and research
publications. While little or no controls are required to protect the confidentiality of Public
data, some level of control is required to prevent unauthorized modification or destruction
of Public data.
Classification of data should be performed by an appropriate Data Steward. Data Stewards are
senior-level employees of the University who oversee the lifecycle of one or more sets of
Institutional Data.
Data Collections
Data Stewards may wish to assign a single classification to a collection of data that is common in
purpose or function. When classifying a collection of data, the most restrictive classification of
any of the individual data elements should be used. For example, if a data collection consists of a
student's name, address and social security number, the data collection should be classified as
Restricted even though the student's name and address may be considered Public information.
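The "most restrictive classification wins" rule can be written down directly; a minimal sketch using the three levels defined above:

```python
# Sketch: classify a collection by its most restrictive element.
# A higher rank means a more restrictive classification.
RANK = {"Public": 0, "Private": 1, "Restricted": 2}

def classify_collection(element_classifications):
    """Return the most restrictive classification among the elements."""
    return max(element_classifications, key=RANK.__getitem__)

# A student record: name and address may be Public, but the social
# security number is Restricted, so the whole collection is Restricted.
print(classify_collection(["Public", "Public", "Restricted"]))  # Restricted
print(classify_collection(["Public", "Private"]))               # Private
```

Encoding the rule this way makes it easy to audit: the collection's label can be recomputed whenever an element's classification changes.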
Reclassification
On a periodic basis, it is important to reevaluate the classification of Institutional Data to ensure
the assigned classification is still appropriate based on changes to legal and contractual
obligations as well as changes in the use of the data or its value to the University. This evaluation
should be conducted by the appropriate Data Steward. Conducting an evaluation on an annual
basis is encouraged; however, the Data Steward should determine what frequency is most
appropriate based on available resources. If a Data Steward determines that the classification of a
certain data set has changed, an analysis of security controls should be performed to determine
whether existing controls are consistent with the new classification. If gaps are found in existing
security controls, they should be corrected in a timely manner, commensurate with the level of
risk presented by the gaps.
Calculating Classification
The goal of information security, as stated in the University's Information Security Policy, is to
protect the confidentiality, integrity and availability of Institutional Data. Data classification
reflects the level of impact to the University if confidentiality, integrity or availability is
compromised.
Unfortunately there is no perfect quantitative system for calculating the classification of a
particular data element. In some situations, the appropriate classification may be more obvious,
such as when federal laws require the University to protect certain types of data (e.g. personally
identifiable information). If the appropriate classification is not inherently obvious, consider each
security objective using the following table as a guide. It is an excerpt from Federal Information
Processing Standards (FIPS) publication 199 published by the National Institute of Standards and
Technology, which discusses the categorization of information and information systems.
POTENTIAL IMPACT

Availability (ensuring timely and reliable access to and use of information):
Low: The disruption of access to or use of information or an information system could be expected to have a limited adverse effect on organizational operations, organizational assets, or individuals.
Moderate: The disruption of access to or use of information or an information system could be expected to have a serious adverse effect on organizational operations, organizational assets, or individuals.
High: The disruption of access to or use of information or an information system could be expected to have a severe or catastrophic adverse effect on organizational operations, organizational assets, or individuals.
As the total potential impact to the University increases from Low to High, the classification of
data should become more restrictive moving from Public to Restricted. If an appropriate
classification is still unclear after considering these points, contact the Information Security
Office for assistance.
Predefined Types of Restricted Information
The Information Security Office and the Office of General Counsel have defined several types of
restricted data based on state and federal regulatory requirements. They're defined as follows:
1. Authentication Verifier
o Passwords
o Shared secrets
o Cryptographic private keys
See the University's Gramm-Leach-Bliley Information Security Program.
o Electronic storage media includes computer hard drives and any removable and/or
transportable digital memory medium, such as magnetic tape or disk, optical disk,
or digital memory card.
o Transmission media used to exchange information already in electronic storage
media. Transmission media includes, for example, the Internet, an extranet (using
Internet technology to link a business with information accessible only to
collaborating parties), leased lines, dial-up lines, private networks and the physical
movement of removable and/or transportable electronic storage media. Certain
transmissions, including of paper, via facsimile, and of voice, via telephone, are
not considered to be transmissions via electronic media because the information
being exchanged did not exist in electronic form before the transmission.
Export Controlled Materials is defined as any information or materials that are subject to
United States export control regulations including, but not limited to, the Export
Administration Regulations (EAR) published by the U.S. Department of Commerce and
the International Traffic in Arms Regulations (ITAR) published by the U.S. Department of
State. See the Office of Research Integrity and Compliance's FAQ on Export Control for
more information.
FTI (Federal Tax Information) is defined as any return, return information or taxpayer return
information that is entrusted to the University by the Internal Revenue Service. See Internal Revenue Service
Publication 1075 Exhibit 2 for more information.
Payment card information is defined as a credit card number (also referred to as a primary
account number or PAN) in combination with one or more of the following data elements:
o Cardholder name
o Service code
o Expiration date
o CVC2, CVV2 or CID value
o PIN or PIN block
o Contents of a credit card’s magnetic stripe
Payment Card Information is also governed by the University's PCI DSS Policy and
Guidelines (login required).
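Because a PAN in combination with any of these elements is restricted, systems that must display card numbers typically mask them. A minimal sketch, assuming the common PCI DSS display rule of showing at most the first six and last four digits (the function name is ours):

```python
def mask_pan(pan: str) -> str:
    """Mask a primary account number (PAN) for display, keeping at
    most the first six and last four digits, per the usual PCI DSS
    masking rule."""
    digits = "".join(ch for ch in pan if ch.isdigit())
    if len(digits) < 13:  # too short to be a plausible PAN
        return "*" * len(digits)
    return digits[:6] + "*" * (len(digits) - 10) + digits[-4:]
```

For example, `mask_pan("4111 1111 1111 1111")` yields `"411111******1111"`.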
Personally Identifiable Education Records are defined as any Education Records that
contain one or more of the following personal identifiers:
For the purpose of meeting security breach notification requirements, PII is defined as a
person’s first name or first initial and last name in combination with one or more of the
following data elements:
o Social security number
o State-issued driver’s license number
o State-issued identification card number
o Financial account number in combination with a security code, access code or
password that would permit access to the account
o Medical and/or health insurance information
Protected Health Information (PHI) is defined as individually identifiable health information
that contains one or more of the following identifiers:
o Name
o Address (all geographic subdivisions smaller than state including street address,
city, county, precinct or zip code)
o All elements of dates (except year) related to an individual, including birth date,
admission date, discharge date, date of death, and exact age if over 89
o Telephone numbers
o Fax numbers
o Electronic mail addresses
o Social security numbers
o Medical record numbers
o Health plan beneficiary numbers
o Account numbers
o Certificate/license numbers
o Vehicle identifiers and serial numbers, including license plate number
o Device identifiers and serial numbers
o Uniform Resource Locators (URLs)
o Internet protocol (IP) addresses
o Biometric identifiers, including finger and voice prints
o Full face photographic images and any comparable images
o Any other unique identifying number, characteristic or code that could identify an
individual
Per Carnegie Mellon’s HIPAA Policy, PHI does not include education records or
treatment records covered by the Family Educational Rights and Privacy Act or
employment records held by the University in its role as an employer.
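The breach-notification definition of PII given above (a name combined with at least one sensitive element) lends itself to a simple automated check. The field names below are hypothetical placeholders, not drawn from any statute:

```python
# Sensitive elements from the breach-notification definition above.
SENSITIVE_ELEMENTS = (
    "ssn",                  # social security number
    "drivers_license",      # state-issued driver's license number
    "state_id",             # state-issued identification card number
    "account_with_access",  # financial account no. plus code/password
    "medical_info",         # medical and/or health insurance information
)

def is_notifiable_pii(record: dict) -> bool:
    """True when the record pairs a name (first name or initial plus
    last name) with at least one sensitive element."""
    has_name = bool(record.get("first_name_or_initial")) and \
               bool(record.get("last_name"))
    has_element = any(record.get(field) for field in SENSITIVE_ELEMENTS)
    return has_name and has_element
```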
Individuals, as citizens and consumers, need to have the means to exercise their right to
privacy and to protect themselves and their information from abuse. This is particularly the case
when it comes to our personal information. Data protection is about safeguarding our
fundamental right to privacy, which is enshrined in international and regional laws and
conventions.
Data protection is commonly defined as the law designed to protect your personal information,
which is collected, processed and stored by “automated” means or intended to be part of a filing
system. In modern societies, to empower us to control our information and to protect us from
abuses, it is essential that data protection laws restrain and shape the activities of companies and
governments. These institutions have shown repeatedly that unless rules restrict their actions,
they will endeavor to collect it all, mine it all, and keep it all, while telling us nothing at all.
Such abuses can be countered through strong data protection practices, with effective legislation to help minimize needless
monitoring by officialdom and to regulate surveillance by companies.
Since the 1960s and the expansion of information technology capabilities, business and
government organisations have been storing this personal information in databases. Databases
can be searched, edited, cross-referenced and data shared with other organisations and across the
world. Once the collection and processing of data became widespread, people started asking
questions about what was happening to their information once it was turned over. Who had the right to
access the information? Was it kept accurately? Was it being collected and disseminated without
their knowledge? Could it be used to discriminate or abuse other fundamental rights?
From all this, and growing public concern, data protection principles were devised through
numerous national and international consultations. The German region of Hesse passed the first
law in 1970, while the US Fair Credit Reporting Act 1970 also contained some elements of data
protection. The US led development of the 'fair information practices' in the early 1970s that
continue to shape data protection law today. The UK also established a committee around the
same time to review threats by private companies and came to similar conclusions.
National laws emerged soon afterwards, beginning with Sweden, the US, Germany and France.
Further momentum was added in 1980 when the Organisation for Economic Cooperation and
Development (OECD) developed its privacy guidelines that included 'privacy principles', and
shortly thereafter the Council of Europe's convention came into force.
While over 100 countries now have laws, in many countries there is still a great need for stronger
legal safeguards to give citizens and consumers confidence in what is done to their personal
information by government and business. Although most countries have accepted that data
protection is necessary in selected sectors, they have not yet developed comprehensive data
protection laws that apply to all business sectors and to government.
Where a comprehensive data protection law exists, organisations, public or private, that collect
and use your personal information have the obligation to handle this data according to the data
protection law. This law is based on a number of basic principles. Briefly, these principles
require that:
there should be limits to what is collected: personal information should be obtained by
lawful and fair means, with the knowledge or consent of the individual;
there must be no secret purposes: the purposes for which the information is to be used
should be specified at least at the time of collection and should only be used for those
agreed purposes;
there must be no creeping purposes: personal information may be disclosed, used, or
retained only for the original purposes, except with the consent of the individual or under
law, and accordingly it must be deleted when no longer necessary for that purpose;
the information must be secure: reasonable security safeguards are used to protect
personal information from loss, unauthorized access, destruction, use, modification or
disclosure;
Data protection rules need to be enforced by a regulator or authority, often called a Privacy
Commissioner. The strength of the powers vested in these authorities varies from country to
country, and so does their independence from government. These powers, for example, can include
the ability to conduct investigations, act on complaints and impose fines when they discover an
organisation has broken the law.
Apart from enforcement through regulatory means, we also believe that technologies can play a
strong role in ensuring data protection rules are followed. Through technological means and
careful design, it is possible to limit data collection, to mathematically restrict further data
processing, and to limit unnecessary access, among other privacy measures.
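One such technological measure is pseudonymization: replacing a direct identifier with a keyed hash, so records remain linkable for processing without exposing the raw value. A minimal sketch using Python's standard library (the key and function name are placeholders):

```python
import hashlib
import hmac

# Placeholder key; in practice this would come from a key-management
# system, never from source code.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(identifier: str) -> str:
    """Return a stable keyed hash of an identifier, so the same input
    always maps to the same pseudonym but the original value cannot
    be read back without the key."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```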
Laws can influence and, when necessary, compel such developments, though adoption has been
slow: companies and governments are reluctant to limit their future capabilities or aspirations to
mine our information, even when they are legally required to limit purpose creep.
9.5 How many countries in the world have data protection laws?
Over 100 countries around the world have enacted comprehensive data protection legislation,
and several other countries are in the process of passing such laws. Other countries may have
privacy laws applying to certain areas, for example for children or financial records, but do not
have a comprehensive law. The US, for instance, was an early leader in the field of data
protection, yet its Privacy Act of 1974 applies only to the Federal Government; subsequent laws
apply to specific sectors, and there is no comprehensive law to date.
The strongest and most comprehensive laws are in the countries of the European Union and
European Economic Area that have implemented the 1995 Data Protection Directive. This is
currently undergoing a difficult process of revision in Brussels.
Canada is another leading example with two separate pieces of legislation applying at the
national level to government and industry, with additional laws at the provincial level as well.
For more information on data protection laws, broken down by country, check out
the comprehensive reports published over the years by Privacy International.
9.6 Are data protection laws the same in all countries that have them?
No, and increasingly this is part of the problem. As our information travels around the world
through borderless networks, our data may end up in countries that have different laws of
varying strength or no law at all, meaning we’d have no remedies if our rights are abused. In
essence, depending on what services you use, different pieces of your data will be in various
countries.
Data protection law has become not only a vehicle for protecting citizens and consumers but also
a gateway to trade. Various international conventions and guidelines have been
established in order to ensure that information can circulate around the world without causing too
much damage to ‘data subjects’ and that businesses do not base themselves in countries with the
weakest laws. The OECD Guidelines on the Protection of Privacy, first agreed in 1980 and
revised in 2013, were the pioneer in establishing the data protection principles, adopted by many
countries in their legislation.
A driving motivation for the OECD Guidelines was to enable protection of privacy while
enabling data to flow across borders, and opening up markets.
The international instrument with the most teeth, however, is the Council of Europe's 1981
Convention for the Protection of Individuals with regard to Automatic Processing of Personal
Data. This has the force of law for the countries that have signed up to it. Countries outside
Europe can also sign up, but unfortunately only Uruguay has done so thus far.
The EU's 1995 Directive standardized laws to some extent across European Union member
states, partly to enable trade within the European market. The Directive required that data could
only be sent to foreign jurisdictions if those countries had adequate laws with protections in
place. One notable exception, however, is the US, which has repeatedly failed to implement a
comprehensive law; its 1974 Privacy Act applies only to the Federal Government and protects
only US citizens and residents.
As an attempt at a quick fix, there is a separate agreement on personal information transfers
between the EU and the US, called the Safe Harbor agreement.
This arrangement has been heavily criticized by both Privacy International and the European
Commission itself, as it is a voluntary and self-regulatory system which is not adequately
implemented and not sufficiently enforced. Though the Obama administration has promised to
extend the Privacy Act to European citizens and has repeatedly mentioned introducing a
comprehensive law, no meaningful action has yet occurred. It is therefore highly problematic
that much of the world's information passes through and exists under the jurisdiction of US law,
where non-Americans have no rights at all.
The EU and Council of Europe are trying to update their instruments to consider new challenges
to privacy, and to strengthen protections. These laws were drafted before the rise of internet
giants and marketing associations with significant lobbying capabilities; and before the rise of
the anti-terrorism policy agenda. As such, government agencies and companies have been
working hard to undermine these legal instruments. For instance, over 3000 amendments were
introduced in the European Parliament when the draft General Data Protection Regulation was
being discussed, some of them introduced by members of the European Parliament who
had copied and pasted the amendments from industry lobbyists' briefings. The interests in
undermining data protection are stronger than ever.
Personal information means any kind of information (a single piece of information or a set of
information) that can personally identify an individual or single them out as an individual. The
obvious examples are somebody’s name, address, national identification number, date of birth or
a facial image. A few perhaps less obvious examples include vehicle registration plate numbers,
credit card numbers, fingerprints, a computer’s IP address, CCTV video footage, or health
records.
You can be singled out from other people even if your name is not known; for example online
profiling companies assign a unique number and use tracking techniques to follow you around
the net and build a profile of your behaviour and interests in order to present you with
advertisements. Some personal information is considered more sensitive than other kinds, and is
therefore subject to stricter rules; this includes your racial or ethnic origin, political views,
religion, health, and sex life. Such information cannot be collected or used at all without your
specific consent.
Ensuring that we comply with the eight data protection principles, as listed below.
Meeting our legal obligations as laid down by the Data Protection Act 1998
Processing personal data only in order to meet our operational needs or fulfil legal
requirements
Ensuring that a nominated officer is responsible for data protection compliance and
provides a point of contact for all data protection issues
Ensuring that all staff are made aware of good practice in data protection
Providing adequate training for all staff responsible for personal data
Ensuring that everyone handling personal data knows where to find further guidance
Ensuring that queries about data protection, internal and external to the organisation, are
dealt with effectively and promptly
Regularly reviewing data protection procedures and guidelines within the organisation.
CHAPTER-10
Publicly available personal information poses a greater risk for Indians because a majority of the
population is illiterate and there is no law mandating data protection. Individuals repeatedly
hand over their personal information for various activities. Aspects such as the purpose for
collecting personal information, how the information will be used, the security mechanisms put in
place to protect it, how long it will be stored, and the procedure for destroying it are neither
known to the individual nor defined uniformly in any law. India has no specific legislation
focusing on data protection; a few data protection principles are scattered across the IT Act and
guidelines issued by the RBI, TRAI, etc.
Any kind of processing of personal data should be fair and transparent. Providers of personal
information should be made aware of risks, rules, safeguards and rights in relation to the
processing of personal data and how to exercise their rights in relation to such processing.
Particularly, the specific purposes for which personal data is processed should be explicit and
legitimate and determined at the time of the collection of the personal data. Personal data should
be processed in a manner that ensures appropriate security and confidentiality of the personal
data, including for preventing unauthorized access to or use of personal data and the equipment
used for the processing. Basic principles guiding the processing of personal data are as follows:
Lawfulness, fairness and transparency. There should be a general policy of openness about
developments, practices and policies with respect to personal data. Means should be readily
available for establishing the existence and nature of personal data, and the main purposes of their
use, as well as the identity and usual residence of the data controller.
Personal data should be collected for specified, explicit and legitimate purposes and not further
processed in a manner that is incompatible with those purposes. The purposes for which personal
data are collected should be specified not later than at the time of data collection and the
subsequent use limited to the fulfilment of those purposes or such others as are not incompatible
with those purposes and as are specified on each occasion of change of purpose.
Collection of Personal Data should be adequate, relevant and limited to what is necessary in
relation to the purposes for which they are processed. This is also known as the principle of Data
minimization. There should be limits to the collection of personal data and any such data should
be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of
the data subject.
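The data minimization principle can be illustrated as a filter applied at collection time. The purpose register below is entirely hypothetical, purely to show the idea:

```python
# Hypothetical purpose register: which fields each declared purpose
# actually needs.
PURPOSE_FIELDS = {
    "newsletter": {"email"},
    "delivery": {"name", "address", "phone"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Keep only the fields needed for the declared purpose and
    discard everything else, enforcing minimization at collection."""
    allowed = PURPOSE_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}
```

An undeclared purpose yields an empty record, i.e. nothing may be collected for it.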
The agency collecting personal data should ensure the accuracy of the data and delete or rectify
inaccurate data. The Data Quality Principle entails that personal data should be relevant to the
purposes for which they are to be used and, to the extent necessary for those purposes, accurate,
complete and kept up to date.
Personal data should be kept in a form which permits identification of data subjects for no longer
than is necessary for the purposes for which the personal data are processed; personal data may
be stored for longer periods insofar as the personal data will be processed solely for archiving
purposes in the public interest, scientific or historical research purposes, or statistical purposes.
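This storage-limitation principle implies a retention schedule and a periodic deletion check, which can be sketched as follows (the retention periods and purpose names are illustrative, not taken from any law):

```python
from datetime import date, timedelta

# Illustrative retention schedule per purpose.
RETENTION_PERIODS = {
    "billing": timedelta(days=7 * 365),
    "marketing": timedelta(days=365),
}

def is_expired(collected_on: date, purpose: str, today: date) -> bool:
    """True when a record has outlived the retention period for its
    purpose and should be deleted (or anonymised, if it is kept for
    archiving, research or statistical purposes)."""
    limit = RETENTION_PERIODS.get(purpose)
    return limit is not None and (today - collected_on) > limit
```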
Personal data should be processed in a manner that ensures appropriate security of such data,
including protection against unauthorized or unlawful processing and against accidental loss,
destruction or damage, using appropriate technical or organisational measures. This principle of
integrity and confidentiality entails that personal data should not be disclosed, made available or
otherwise used for purposes other than those specified, except with the consent of the data
subject or by the authority of law.
As per the Security Safeguards Principle, personal data should be protected by reasonable
security safeguards against such risks as loss or unauthorized access, destruction, use,
modification or disclosure of data.
A data controller should be accountable for complying with measures which give effect to the
principles stated above.
Data protection rules should be applicable to all entities and persons handling personal data –
both private and public sector bodies. There is no rationale as to why principles such as
openness, purpose limitation, use limitation, etc. should not be applicable to public bodies
generally. Certain specialized functions such as those related to crime and investigation, national
security, taxation should be exempted from the general obligations and should be subject to
specific rules.
It can be seen from above that protection of personal data and Right to privacy are intrinsically
linked. Only a strong emphasis on the right to privacy can ensure that personal data is not shared
or leaked incessantly without any checks. It is the duty of the State to ensure individual
autonomy. In recent times, however, the very concept of individual autonomy is also at risk. The
right to privacy has its roots in the law of tort, under which any unlawful invasion of privacy
gave a cause of action for damages. The right to privacy involves two aspects:
(1) Unlawful invasion to privacy affords a tort action for damages resulting from an unlawful
invasion of privacy and
(2) The constitutional recognition given to the right to privacy which protects personal privacy
against unlawful governmental invasion. The first aspect of this right is violated where, for
example, a person's name or likeness is used without his consent for advertising or other
purposes, or his life story, whether laudatory or otherwise, is written and published without his
consent.
In recent times, however, this right has acquired a constitutional status. It is not enumerated as a
fundamental right but has been read into Article 21. The Indian courts are to be thanked for the
development and evolution of the right to privacy over the years. The first decision on the right
to privacy was Kharak Singh v. State of U.P. Since then the concept has evolved with every
invasion of privacy.
One of the most important pieces of legislation protecting our data at present is the Information
Technology Act (hereinafter the IT Act). The IT Act makes hacking and tampering with computer
source code an offence and penalizes unlawful access to data. However, it does not prescribe any
minimum security standards with which entities having control of data must comply, except in
the case of sensitive personal data. The Information Technology (Reasonable Security Practices
and Procedures and Sensitive Personal Data or Information) Rules, 2011 define sensitive personal
data or information of a person as personal information which consists of information relating to:
(i) password;
(ii) financial information, such as bank account, credit card, debit card or other payment
instrument details;
Maintaining databases is not as difficult a task as maintaining their integrity, so much of the
current debate centres on devising an effective method of data protection. With advances in
technology, the nature of crime has also shifted: in the present era, many crimes are committed
by professionals through the easiest medium, namely computers and electronic gadgets. With a
single click, criminals are able to reach secured information. The lust for information is acting
as a catalyst in the growth of cybercrime.
Giving adequate protection to their huge databases is a major headache for business houses,
financial institutions and government bodies. In the absence of any stringent law on data
protection, miscreants are gaining expertise in their work day by day.
Though the digital world has simplified our lifestyle, it has also created avenues for the
involuntary disclosure of data, as these illustrations show:
1. On every login to an e-mail account in a cyber cafe, an electronic trail of the password is left
behind unsecured.
2. On every use of a credit card for purchases, a trail of brand preferences, place of shopping,
etc. is left behind.
3. On every login to the internet, an electronic trail is left behind, enabling website owners and
advertising companies to access users' preferences and choices by tracking them.
4. Employees are under surveillance, as employers routinely use software to access employees'
e-mail and monitor their movements.
5. Police phone call signals are easily tracked by Naxalites, enabling them to learn of police
plans.
6. Unsolicited e-mails are also a usual means of gathering users' personal information.
7. Movement across the web can be tracked by placing cookies and then retrieving them in a way
that allows detailed profiles of a user's interests, spending habits and lifestyle to be built.
Thus it is easy to see how much room we give miscreants to carry out and refine their acts, and to
question how safe it really is to avail of the services of the digital world.
Data protection under Indian Law
Our Constitution provides for privacy within the scope of Article 21, but its interpretation has
proved insufficient to give data adequate protection. In 2000, the legislature made an effort to
bring privacy issues relating to computer systems within the purview of the IT Act, 2000, which
contains certain provisions protecting stored data. In 2006, the legislature also introduced a bill,
'The Personal Data Protection Bill', to protect the personal information of individuals.
Section 43
This section provides protection against unauthorized access to a computer system by imposing a
heavy penalty of up to one crore rupees. Unauthorized downloading, extraction and copying of
data are covered by the same penalty. Clause (c) of this section imposes a penalty for the
unauthorized introduction of computer viruses or contaminants, and clause (g) provides penalties
for assisting unauthorized access.
Section 65
This section protects computer source code. Anyone who knowingly or intentionally conceals,
destroys or alters it, or causes another to do so, is liable to imprisonment or a fine of up to two
lakh rupees. Protection is thus provided against tampering with computer source documents.
Section 66
Protection against hacking is provided under this section. Hacking is defined as any act done
with the intention of causing wrongful loss or damage to any person, or with the knowledge that
wrongful loss or damage will be caused, whereby information residing in a computer resource is
destroyed, deleted or altered, or its value and utility diminished. The section imposes on the
hacker a penalty of imprisonment for up to three years, a fine of up to two lakh rupees, or both.
Section 70
This section protects data stored in protected systems. Protected systems are those computers,
computer systems or computer networks which the appropriate government, by notification in the
Official Gazette, has declared to be protected. Any access, or attempt to secure access, to such a
system in contravention of this section makes the person liable to imprisonment for a term which
may extend to ten years, and also to a fine.
Section 72
This section provides protection against breach of confidentiality and privacy of data. Any
person who, having been conferred powers under the IT Act and allied rules to secure access to
any electronic record, book, register, correspondence, information, document or other material,
discloses it to any other person shall be punished with imprisonment for a term which may
extend to two years, or with a fine which may extend to one lakh rupees, or with both.
Law of contract
These days, companies rely on contract law as a useful means of protecting their information.
Corporate houses enter into several agreements with other companies, clients, agencies or
partners to keep their information secured to the extent they wish. Agreements such as
non-circumvention and non-disclosure agreements, user licence agreements, referral partner
agreements, etc., which contain confidentiality and privacy clauses as well as arbitration clauses
for resolving any dispute that arises, help them in the smooth running of business. BPO
companies have implemented standards such as BS 7799 and ISO 17799 for information security
management, which restrict the quantity of data that can be made available to employees of
BPOs and call centres.
Following in the footsteps of foreign laws, the Personal Data Protection Bill was introduced in
the Rajya Sabha on 8 December 2006. The purpose of this bill is to protect the personal data and
information of an individual collected for a particular purpose by one organization, to prevent its
use by another organization for commercial or other purposes, and to entitle the individual to
claim compensation or damages for unauthorized disclosure of that personal data.
Many countries other than India treat data protection as a separate discipline, with well-framed
and established laws devoted exclusively to it.
U.K Law
The UK Parliament framed its Data Protection Act (DPA) in 1984, which was thereafter repealed
and replaced by the DPA 1998. This Act was instituted to protect the privacy of individuals'
personal data in the UK. It covers data which can be used to identify a living person, including
names, birthdays, anniversary dates, addresses, telephone numbers, fax numbers, e-mail
addresses, etc. It applies only to data which is held, or intended to be held, on computers or other
equipment operating automatically in response to instructions given for that purpose, or held in a
relevant filing system.
As per the Act, persons and organizations which store personal data must register with the
Information Commissioner, the government official appointed to oversee the Act. The Act places
restrictions on the collection of data: personal data may be obtained only for one or more
specified and lawful purposes, and shall not be further processed in any manner incompatible
with those purposes. Personal data shall be adequate, relevant, and not excessive in relation to
the purpose or purposes for which they are processed.
U.S Law
Though both the US and the European Union focus on enhancing the privacy protection of their
citizens, the US takes a different approach from that of the European Union. The US adopted a
sectoral approach that relies on a mix of legislation, regulation and self-regulation. In the US,
data are grouped into several classes on the basis of their utility and importance, and a different
degree of protection is accorded to each class.
Several Acts were also passed to stabilize data protection law in the United States. The Privacy
Act of 1974 established standards for when it is reasonable, ethical and justifiable for
government agencies to compare data across different databases. The Electronic
Communications Privacy Act restricts the interception of electronic communications and
prohibits access to stored data without the consent of the user or the communication service.
Further, the Children's Online Privacy Protection Act, passed by the US Congress in October
1998, requires website operators to obtain parental consent before collecting personal
information from children, and the Consumer Internet Privacy Protection Act required ISPs to
obtain a subscriber's permission before disclosing his personal information to third parties.
However, the existing federal laws do not suffice to cover the broad range of issues and
circumstances that make the new digital environment a threat to personal privacy. Furthermore,
the US Government has been reluctant to impose a regulatory burden on electronic commerce
that could hamper its development, and has instead looked to self-regulation.
The 2006 Bill also addresses the disclosure of an individual's data or information without his
consent, and matters connected with or incidental to the Act. Its provisions concern the nature of
data to be obtained for a specific purpose and the quantum of data to be obtained for that
purpose. Data controllers are proposed to be appointed to look into matters relating to violations
of the proposed Act.
Conclusion
Comparing Indian law with that of developed countries reveals what Indian law properly
requires. Data are not all of the same utility and importance; they vary on the basis of utility. We
therefore need to frame separate categories of data with different utility values, as the US has
done. Moreover, the provisions of the IT Act deal essentially with the extraction and destruction
of data; companies cannot obtain full protection of their data through them, which ultimately
forces them to enter into separate private contracts to keep their data secure. These contracts
have the same enforceability as any general contract.
Despite the efforts made towards a data protection law as a separate discipline, our legislature has left some lacunae in framing the bill of 2006. The bill has been drafted wholly on the structure of the UK Data Protection Act, whereas today's requirement is for a comprehensive Act. It can thus be suggested that a draft compiled on the basis of US data protection laws would better suit today's requirements.
Data protection being one of the most discussed topics of the modern era, legislatures are required to frame a more stringent and comprehensive law for the protection of data, which demands a qualitative effort rather than a quantitative one.
CHAPTER-11
will also have to appoint a representative in the EU.
Penalties
Under the GDPR, organizations in breach can be fined up to 4% of annual global turnover or €20 million, whichever is greater. This is the maximum fine that can be imposed for the most serious infringements, e.g. not having sufficient customer consent to process data or violating the core of Privacy by Design concepts. There is a tiered approach to fines; e.g. a company can be fined 2% for not having its records in order (Article 28), for not notifying the supervising authority and data subject about a breach, or for not conducting an impact assessment. It is important to note that these rules apply to both controllers and processors -- meaning 'clouds' will not be exempt from GDPR enforcement.
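As a rough illustration (and not legal advice), the tiered fine ceiling described above reduces to a "greater of" calculation. The €10 million floor for the lower tier is an assumption drawn from Article 83 of the Regulation rather than from the text above:

```python
def gdpr_max_fine(annual_global_turnover_eur: float, serious: bool) -> float:
    """Maximum possible GDPR fine under the tiered scheme.

    Upper tier (e.g. insufficient consent): 4% of annual global turnover
    or EUR 20 million, whichever is greater.
    Lower tier (e.g. records not in order): 2% of turnover or EUR 10
    million, whichever is greater (the EUR 10M floor is taken from
    Article 83, an assumption not stated in the text above).
    """
    if serious:
        return max(0.04 * annual_global_turnover_eur, 20_000_000)
    return max(0.02 * annual_global_turnover_eur, 10_000_000)


# A company with EUR 1bn global turnover faces up to EUR 40 million
# for a serious infringement, since 4% of turnover exceeds EUR 20M.
exposure = gdpr_max_fine(1_000_000_000, serious=True)
```

For small organizations the fixed floor dominates; for large ones the percentage does, which is why the "whichever is greater" wording matters.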
Consent
The conditions for consent have been strengthened, and companies will no longer be able to use long, illegible terms and conditions full of legalese: the request for consent must be given in an intelligible and easily accessible form, with the purpose for data processing attached to that consent. Consent must be clear and distinguishable from other matters, and provided using clear and plain language. It must be as easy to withdraw consent as it is to give it.
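The "as easy to withdraw as to give" principle can be pictured with a minimal, hypothetical consent record; the class and method names here are invented for illustration only:

```python
class ConsentRecord:
    """Hypothetical consent record: purpose-specific and withdrawable in one call."""

    def __init__(self, subject: str, purpose: str):
        self.subject = subject   # who consented
        self.purpose = purpose   # the specific processing this consent covers
        self.given = True        # giving consent is a single explicit step...

    def withdraw(self) -> None:
        self.given = False       # ...and so is withdrawing it


# Consent is recorded against one purpose, never bundled with others.
consent = ConsentRecord("subject-001", "newsletter marketing")
consent.withdraw()
```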
Breach Notification
Under the GDPR, breach notification will become mandatory in all member states where a data breach is likely to "result in a risk for the rights and freedoms of individuals". This must be done within 72 hours of first having become aware of the breach. Data processors will also be required to notify their customers, the controllers, "without undue delay" after first becoming aware of a data breach.
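The 72-hour window is a simple deadline calculation from the moment of awareness. This hypothetical helper sketches how a compliance system might track it:

```python
from datetime import datetime, timedelta

# Notify the supervisory authority within 72 hours of becoming aware.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware: datetime) -> datetime:
    """Latest time by which the supervisory authority must be notified."""
    return became_aware + NOTIFICATION_WINDOW


# Awareness on 25 May 2018 at 09:00 gives a deadline of 28 May at 09:00.
deadline = notification_deadline(datetime(2018, 5, 25, 9, 0))
```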
Right to Access
Part of the expanded rights of data subjects outlined by the GDPR is the right for data subjects to
obtain from the data controller confirmation as to whether or not personal data concerning them
is being processed, where and for what purpose. Further, the controller shall provide a copy of the personal data, free of charge, in an electronic format. This change is a dramatic shift toward data transparency and the empowerment of data subjects.
Right to be Forgotten
Also known as Data Erasure, the right to be forgotten entitles the data subject to have the data
controller erase his/her personal data, cease further dissemination of the data, and potentially
have third parties halt processing of the data. The conditions for erasure, as outlined in Article 17, include the data no longer being relevant to the original purposes for processing, or a data subject withdrawing consent. It should also be noted that this right requires controllers to weigh the subject's rights against "the public interest in the availability of the data" when considering such requests.
Data Portability
The GDPR introduces data portability: the right for a data subject to receive the personal data concerning them which they have previously provided, in a 'commonly used and machine-readable format', and the right to transmit that data to another controller.
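JSON is one plausible 'commonly used and machine-readable format'; a minimal export sketch follows, with field names chosen purely for illustration:

```python
import json

def export_subject_data(record: dict) -> str:
    """Serialize the data a subject previously provided into JSON, a
    commonly used, machine-readable format suitable for transmission
    to another controller."""
    return json.dumps(record, indent=2, sort_keys=True)


# Hypothetical subject record being exported for a portability request.
exported = export_subject_data({"name": "A. Subject", "email": "a@example.org"})
```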
Privacy by Design
Privacy by design as a concept has existed for years now, but it is only just becoming part of a
legal requirement with the GDPR. At its core, privacy by design calls for the inclusion of data
protection from the onset of the designing of systems, rather than as an addition. More specifically, 'the controller shall implement appropriate technical and organisational measures in an effective way in order to meet the requirements of this Regulation and protect the rights of data subjects'. Article 23 calls for controllers to hold and process only the data absolutely necessary for the completion of their duties (data minimization), as well as to limit access to personal data to those who need it to carry out the processing.
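Data minimization can be pictured as filtering a record down to a whitelist of strictly necessary fields; the field set below is a made-up example, not a prescription:

```python
# Hypothetical whitelist: the only fields this particular processing needs.
NECESSARY_FIELDS = {"name", "email"}

def minimize(record: dict) -> dict:
    """Data minimization: drop every field not strictly necessary for the task."""
    return {k: v for k, v in record.items() if k in NECESSARY_FIELDS}


# Extra fields (religion, date of birth, ...) are discarded before processing.
lean_record = minimize({"name": "A", "email": "a@x", "religion": "-", "dob": "1990"})
```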
Data Protection Officers
A Data Protection Officer (DPO):
Must be appointed on the basis of professional qualities and, in particular, expert knowledge of data protection law and practices
May be a staff member or an external service provider
Contact details must be provided to the relevant DPA
Must be provided with appropriate resources to carry out their tasks and maintain their expert knowledge
Must report directly to the highest level of management
Must not carry out any other tasks that could result in a conflict of interest.
Previous Legislation
1995 – October 24th, Data Protection Directive 95/46/EC created to regulate the
processing of personal data
Legislative Proposals
2012 – January 25th, initial proposal for updated data protection regulation by the
European Commission
2014 – March 12th, the European Parliament approved its own version of the regulation
in its first reading
2015 – June 15th, the Council of the European Union approved its version in its first
reading, known as the general approach, allowing the regulation to pass into the final
stage of legislation known as the “Trilogue”
Trilogue Timeline
o Data subject rights (Chapter III)
o Controller and Processor (Chapter IV)
2015 – December 15th, the Parliament and Council came to an agreement, with the text to be final as of the official signing in early January of 2016.
2016 - January
o April 8th - Adopted by the Council of the European Union
Enforcement
2018 - May - Following a two-year post-adoption grace period, the GDPR will become fully enforceable throughout the European Union.
The GDPR was approved and adopted by the EU Parliament in April 2016. The regulation takes effect after a two-year transition period and, unlike a directive, it does not require any enabling legislation to be passed by governments, meaning it will be in force in May 2018.
In light of an uncertain 'Brexit': I represent a data controller in the UK and want to know whether I should still continue with GDPR planning and preparation.
If you process data about individuals in the context of selling goods or services to citizens in other EU countries, then you will need to comply with the GDPR, irrespective of whether or not the UK retains the GDPR post-Brexit. If your activities are limited to the UK, then the position (after the initial exit period) is much less clear. The UK Government has indicated it will implement equivalent or alternative legal mechanisms. Our expectation is that any such legislation will largely follow the GDPR, given the support previously provided to the GDPR by the ICO and the UK Government as an effective privacy standard, together with the fact that the GDPR provides a clear baseline against which UK business can seek continued access to the EU digital market. (Ref: http://www.lexology.com/library/detail.aspx?g=07a6d19f-19ae-4648-9f69-44ea289726a0)
indirectly identify the person. It can be anything from a name, a photo, an email address, bank details, posts on social networking websites, medical information, or a computer IP address.
Do data processors need 'explicit' or 'unambiguous' data subject consent - and what is the
difference?
The conditions for consent have been strengthened: companies will no longer be able to use long, illegible terms and conditions full of legalese, as the request for consent must be given in an intelligible and easily accessible form, with the purpose for data processing attached to that consent -- meaning it must be unambiguous. Consent must be clear and distinguishable from other matters, and provided using clear and plain language. It must be as easy to withdraw consent as it is to give it. Explicit consent is required only for processing sensitive personal data; in this context, nothing short of "opt in" will suffice. For non-sensitive data, however, "unambiguous" consent will suffice.
Parental consent will be required to process the personal data of children under the age of 16 for
online services; member states may legislate for a lower age of consent but this will not be below
the age of 13.
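The age thresholds above (16 by default, with member states free to lower the limit but not below 13) reduce to a simple check; this helper is purely illustrative:

```python
def needs_parental_consent(age: int, member_state_age: int = 16) -> bool:
    """Whether an online service must obtain parental consent for this child.

    The GDPR default threshold is 16; member states may legislate a lower
    age of consent, but not below 13.
    """
    if not 13 <= member_state_age <= 16:
        raise ValueError("member-state age of consent must be between 13 and 16")
    return age < member_state_age


# A 15-year-old needs parental consent under the default, but not in a
# member state that has lowered the threshold to 13.
default_case = needs_parental_consent(15)
lowered_case = needs_parental_consent(15, member_state_age=13)
```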
Current regulations & guidance
In the UK we have:
The Data Protection Act 1998
Privacy and Electronic Communications (EC Directive) Regulations 2003
ICO Direct Marketing Guidance – issued to clarify the ICO's requirements for compliance
Other EU members have their own data protection regulations
The current UK regulation is 'light touch' compared to some other regimes
Future regulation will be covered entirely by the new GDPR, which will take effect after a two-year transition period; unlike a directive, it does not require any enabling legislation to be passed by government, meaning it will be in force in May 2018.
When you hear the word privacy, what comes to your mind?
N=100
Bodily privacy (e.g. your physical body) 32.54
Communication privacy (e.g. calls received or dialed through telephone) 48.33
Information privacy (e.g. information exchanged on the Internet) 51.24
Territorial privacy (e.g. your living space, working space) 31.59
All of the above 28.43
Others 0.68
Which of the following information is personal to you that you would NOT like to share?
N=100
Annual household income 53.64
Bank account details 64.63
Credit card number 68.18
Date of birth 5.80
Email address 6.25
Family details 14.86
Full name 2.03
Health and medical history 27.17
Landline number 8.42
Marital status 3.95
Mobile number 13.45
Passport number 64.45
Passwords 88.39
Personal income 62.77
Pictures and videos featuring self 18.67
Physical details - height, weight, eye color 8.47
Postal mailing address 5.51
Religion 2.70
All of the above 3.65
Others 2.00
Does privacy for you change with situation and context, i.e. what information you share with whom may
be different at different time and context?
N=100
Yes 52.79
No 17.83
May be 29.38
With whom would you share the following information?
Bank Account Details 7.53 47.82 6.83 1.10 24.24 14.14 1.14 43.46
Credit card number 5.02 38.65 3.75 0.91 10.66 5.05 1.31 54.66
Date of birth 31.38 40.03 27.91 7.59 13.54 12.85 54.54 1.67
Email address 32.36 39.69 26.74 6.44 11.85 9.98 51.57 1.81
Family details 35.47 50.46 39.22 6.47 8.17 11.57 32.24 5.54
Full name 20.74 26.45 17.93 5.92 9.75 10.09 68.98 1.17
Health and medical history 26.79 59.66 27.96 2.85 3.99 5.97 14.42 17.94
Landline number 36.45 44.41 36.57 5.63 11.38 8.74 47.34 2.11
Marital status 27.06 34.05 25.14 7.84 9.58 10.02 57.89 2.15
Mobile number 39.54 43.80 34.90 5.57 16.54 12.54 46.31 3.79
Passport number 9.00 34.49 6.85 1.35 5.36 11.22 4.2 55.73
Passwords 2.1 13.38 1.3 0.52 0.80 1.01 1.24 83.97
Personal income 12.1 41.99 9.43 1.43 5.04 6.16 2.9 51.55
Pictures and videos featuring self 44.81 56.53 34.74 3.02 2.12 3.47 23.75 10.09
Physical details - height, weight, eye color 33.97 50.75 29.84 3.74 3.74 6.14 34.85 6.19
Postal mailing address 32.05 41.29 29.46 6.00 15.04 15.46 47.16 2.77
Imagine you are walking through a shopping mall, where you observe a camera capturing the movements
of people in the shops, what would be your reaction? (Choose one which is applicable)
N=100
Strongly agree / Agree / Neutral / Disagree / Strongly disagree
Consumers have lost all control over how personal information about them is circulated and used by companies 23.66 52.97 15.94 5.84 1.13
Most businesses handle the personal information they collect about consumers in a proper and confidential way 13.76 44.87 27.37 13.49 1.97
Mobile phones can be privacy invasive 14.42 55.17 21.35 7.75 0.63
Landline phones can interfere with individuals' privacy 14.29 45.4 27.18 11.39 1.00
Websites can hinder privacy by collecting personal information 20.23 50.99 20.60 6.41 0.84
Credit cards can be privacy invasive 18.98 44.93 24.99 9.40 0.93
Phone banking can invade privacy 15.95 47.24 24.71 9.14 1.57
Consider a scenario where you visit a coffee shop which provides a free Wi-Fi connectivity for its
customers. It doesn't ask for a password for connectivity. Would you access the Wi-Fi facility to log in to your email?
N=100
Definitely would 19.72
Probably would 33.88
Not sure 11.58
Probably not 10.56
Definitely not 24.26
Imagine for checking your results of an entrance exam you went to the institute and saw that the results
have been displayed on a notice board with your name, marks and category (general / OBC / SC)
mentioned.
While traveling in long-distance trains, a reservation chart with details e.g. last name, first name, age,
gender, boarding station, destination, seat number, PNR number for each passenger is displayed on the
platform and the compartment. How would you feel about your information being displayed as in this
scenario?
N=100
Always feel comfortable 36.74
Usually feel comfortable 43.43
Sometimes feel comfortable 8.06
Rarely feel comfortable 4.45
Never feel comfortable 7.33
Section 2: Mobile Privacy
What is the personal information which you don’t mind storing in your mobile phone?
N=100
Business-related information (meeting details) 24.68
Credit card number(s) / ATM card number(s) / PIN number(s) 26.20
Information e.g. date of birth, PAN number, ID number, account number 30.51
Password(s) 25.00
Videos, photographs etc. 64.57
All of the above 10.40
Others 2.29
What are the reasons for which you don’t store personal information on your mobile phone?
N=5,925
Worried about phone being stolen / lost 40.08
Concerned about somebody accessing the phone at work or outdoors without permission 38.51
Concerned about somebody accessing the phone at home without permission 24.86
Don’t feel the need 34.16
All of the above 9.91
Others 0.41
Imagine you visited a mobile service provider (e.g. Vodafone) to buy a new mobile connection, and they asked you to fill in a form giving details e.g. name, date of birth, ID proof. Which of the information given below would you share with them, if the fields are NOT mandatory?
N=100
Alternative address proof 42.46
Another contact number 33.71
Educational qualification 24.56
ID proof 67.18
Permanent address proof 30.12
Photograph(s) 63.51
Proof of place of work 18.19
Parents' details 8.33
All of the above 8.23
None of the above 4.00
Others 0.26
Do you use phone banking services to check your balance in the account?
N=100
Yes, it is safe to use 15.73
Yes, because I don't have a choice 8.53
No, because I fear information may be leaked through phone tapping 21.11
No, because I am not sure of who is on the other side 33.93
Others 20.69
Would you use phone banking services to transfer money from your account?
N=100
Yes, it is safe to use 12.77
Yes, because I don't have a choice 6.89
No, because I fear information may be leaked through phone tapping 22.71
No, because I am not sure of who is on the other side 37.34
Others 20.29
While exchanging information on a mobile phone, what, according to you, is the extent of confidentiality provided by the mobile service provider for the information being exchanged?
N=100
Very high 11.49
High 37.50
Neutral 24.60
Low 9.77
Very low 1.69
I don’t know 14.96
What do you do before you sell your mobile phone?
N=100
Copy the details and other information from SIM card and phone memory 12.64
Copy the information from SIM card and phone memory and then delete it 40.40
Delete all information that is stored in the mobile phone 31.30
Delete only specific details e.g. phone numbers and messages 4.89
Don't do anything 6.68
Others 4.08
While exchanging information on a land-line phone, what, according to you, is the extent of confidentiality provided by the land-line service provider for the information being exchanged?
N=100
Very high 8.25
High 37.41
Neutral 26.51
Low 10.98
Very low 1.84
I don't know 15.01
While moving in a shopping mall, imagine you see somebody taking your picture using a mobile phone,
what would be your reaction?
N=100
No reaction 32.27
I don’t like a stranger taking my picture without my permission 48.94
While travelling (i.e. in roaming), the mobile service providers use regional languages to present information e.g. user busy, phone switched off. For example, if your phone connection is from Delhi and you are traveling in Mumbai, the messages are presented in Marathi. Would you consider this feature privacy invasive?
N=100
Strongly agree 10.02
Agree 43.97
Neutral 22.94
Disagree 19.24
Strongly disagree 3.83
N=100
Yes 57.18
No, but I have used them 11.94
No, I don’t use them at all 30.88
N= 100
Yes 34.58
No 65.42
To whom do you lend your credit card?
N=100
Children 10.38
Friends 26.33
Parents 53.68
Relatives 3.94
Professional Colleagues 10.24
Spouse 29.23
None 0.88
Others 2.91
N=100
Children 4.90
Friends 20.47
Parents 43.11
Professional colleagues 2.43
Relatives 5.59
Spouse 19.10
None 8.85
Others 20.02
N=100
Children 7.83
Friends 27.19
Parents 57.15
Professional colleagues 6.44
Relatives 9.15
Spouse 16.50
None 10.17
Others 2.82
What is true for you with respect to using credit cards in today’s world?
N=100
It is unavoidable 20.12
It is handy; use it frequently for various purposes e.g. shopping, petrol pumps, and grocery shops 39.70
Use only for specific tasks e.g. online ticketing 17.04
Use as a back-up for emergency situations 22.70
Others 0.44
Imagine that you went to a restaurant to have food with your friends / family. Which of the following is
true if you make the payment of the bill through your credit card?
N=100
You would give the card to the waiter for making the payment 19.31
If you can go yourself, you would take the card to the cash counter and get it swiped in front of you 39.94
If you cannot go yourself, you would give the card to the waiter and check the details of the bill carefully 14.45
You would give it to the waiter only if it's a trustworthy restaurant 9.14
You know it can be misused, but cannot do anything about it 3.83
You would not like to use a credit card to make the payment 13.11
Others 0.21
Do you think credit cards should display the details e.g. name, phone number, and date of birth on them?
N=100
It should not display any personal information, as it makes information public 32.75
It should display only relevant details required for identification 44.80
It should display all details as these are required for verification 14.68
It does not bother me 7.36
Others 0.41
Do you think it is possible for anybody to steal your identity and impersonate you, using your credit card?
N=100
Yes, it is fairly easy 44.52
Yes, but it’s not easy 35.58
No, it’s not possible under any circumstance 14.53
I have never thought about it 5.00
Others 0.37
Imagine you go to withdraw money from an ATM; while you are withdrawing money, you notice other people peeping into the ATM as you enter the PIN. Would you consider entering your account details in this scenario?
N=100
Definitely would 17.76
Probably would 30.95
Not sure 9.25
Probably not 16.17
Definitely not 23.15
I have no other choice 2.72
Imagine you go to withdraw money from an ATM; would you use the ATM center if there are two machines in the same center and someone else is using the other machine?
N=100
Definitely would 25.78
Probably would 39.09
Not sure 10.74
Probably not 7.98
Definitely not 13.09
I have no other choice 3.32
Section 4: Internet and Online Social Network
Have you ever removed cookies in your browser after using the Internet?
N=100
Often 22.24
Sometimes 33.99
Hardly ever 5.13
Never 23.36
Not familiar with cookies 7.57
Don’t know 7.71
N=100
Gmail 80.18
Hotmail 24.78
Official email 16.16
Yahoo mail 49.66
Do not use any email services 4.77
Others 2.76
Do you exchange personal information, e.g. bank account numbers, passport details through your email?
N=100
Yes, frequently 14.21
Sometimes 26.51
Only in emergency 18.26
No, not at all 38.52
I don’t remember 2.51
Do you save personal information, e.g. bank account numbers, passport details in your email for future
use?
N=100
Yes, frequently 15.36
Sometimes 24.35
Only in emergency 12.97
No, not at all 44.57
I don’t remember 2.75
What are your privacy concerns while exchanging / saving personal information through email services?
N=100
I have no concerns 18.75
I believe that the privacy of my data is maintained 40.70
I am concerned, but I do not have a choice 22.26
I am concerned so I don’t save/exchange 14.68
Don't know 3.60
Do you read the privacy policy of an e-commerce website e.g. PayPal, eBay, bank websites while creating
an account?
N=100
Yes, I do 34.00
I browse through it 33.91
Never 24.29
Don’t remember 6.99
Others 0.81
Do you read the privacy policy of an email provider while creating an account?
N=100
Yes, I do 34.48
I browse through it 31.20
Never 27.63
Don’t remember 6.29
Others 0.40
N=100
Yes 85.1
No 14.89
What privacy settings do you have for the following information on Facebook? Please provide your
response to the best of your knowledge.
Do you read the permission box that appears when you first access any third party application e.g.
FarmVille, Mafia Wars?
N=100
Yes, I see it but I don't read it and just allow, otherwise I cannot access the application 22.22
Yes, I read the permissions the application asks for, but always "allow" the application 28.44
Yes, I read the permissions the application asks for, and accordingly decide to "allow" or "not allow" the application 19.34
I do not remember seeing any such permission box ever 9.61
No, I will never allow a third-party application to access my personal information 7.92
No, I don't use third-party applications 12.47
When would you use third-party applications on an online social network?
N=100
When a friend recommends an application 40.56
When I see the application on my friend’s news feed / wall 37.66
When online social network recommends an application 18.15
When I randomly stumble on some interesting application 19.01
Others 6.07
Have you connected / inter-linked your various social networking accounts together e.g. Facebook, Twitter, YouTube, Buzz, Orkut?
N=100
Yes 46.15
No 44.91
I do not know of any such linking service 8.94
Do you think it is possible for somebody to steal your identity on your social network website
i.e. create a profile with your name, pictures and details?
N=100
Yes, it is possible, but it has never happened to me 60.39
Yes, it is possible, and it has happened to me 24.03
No, it is not possible 10.07
Don’t know 5.51
Does the Indian constitution have a provision for privacy of Indian citizens?
N=100
Yes, I know about it 34.44
Yes, but I don’t know what it is 27.61
Not sure, I assume there is a provision, but, I am not aware 22.63
of it
I do not know about this kind of a provision 12.26
No, there is no provision 3.06
Do we have privacy laws in India that protect Indian citizen’s privacy?
N=100
Yes 49.01
No 27.34
Not sure 23.65
Are you aware of Unique Identification Number (UID), a Government of India initiative for
every citizen in India?
N=100
Yes 67.23
No 21.11
Heard about it, but do not know the details 11.66
CONCLUSION
The concept of privacy in India has not been investigated in detail, and there is a lack of empirical data on privacy perceptions among Indian citizens. Recent developments in the Indian scenario, e.g. the privacy bill and the UID project, signify the need for privacy awareness and understanding among the Indian masses. It is also important for policy makers to comprehend the sentiment and opinion of the masses when structuring executive laws and policies for the citizens of India. Our study focuses on understanding the privacy perceptions and expectations of Indian citizens. In the first phase, we conducted interviews with 20 participants and 4 focus group discussions with 31 participants in total, to collect qualitative data about privacy perceptions.
Citizens have misinformed mental models of the privacy situation; e.g. some participants felt that there is a law which protects them, whereas there is no privacy law in India.
About 5% of the survey participants tend to accept friend requests from strangers or people whom they don't know but who have common friends with them. This behavior seems to be the same even with third-party applications.
About 80% of the survey respondents were aware of identity theft through credit cards.
About 65% of the survey respondents felt comfortable using an ATM center with more than one machine in it.
We are in the process of writing a more academic-style paper on the reasons for, and implications of, the results from this data.
People are increasingly making their personal information publicly available. Today there is an unprecedented amount of personal data held by government and private sector players. We need to understand the importance of this data, and India should try to develop stringent laws for data protection.
References
http://www.eugdpr.org/
http://www.livelaw.in/data-protection-india/
http://www.isaca.org/Groups/Professional-English/privacy-data-
protection/Pages/Overview.aspx