
PRIVACY IN TECHNOLOGY

ONLINE TRAINING TRANSCRIPT


MODULE 5: PRIVACY ENGINEERING

Introduction

In module 4, we explored the tactics and strategies for countering a variety of threats to privacy. In
module 5, we define the role that privacy engineering plays in protecting privacy throughout an
organization, including key concepts and objectives of privacy engineering, as well as privacy design
patterns that should be used or avoided.

The privacy engineering role in the organization


Learning objective

• Examine the role of the privacy engineer in an organization

Privacy engineering (1)

The National Institute of Standards and Technology’s internal report, An Introduction to Privacy
Engineering and Risk Management in Federal Systems (NIST Internal Report 8062), defines privacy
engineering as a “specialty discipline of systems engineering focused on achieving freedom from
conditions that can create problems for individuals with unacceptable consequences that arise from the
system as it processes PII.” In comparison, software engineering can be described as a disciplined
approach to the construction of software consisting of well-trained professionals and well-established
processes. Privacy engineering brings the complementary perspectives and practices of software engineers
and privacy professionals together and is grounded in practical solutions capable of measuring and
monitoring the ever-changing state of privacy within information technology.

Privacy engineering (2)

Technology is central to modern privacy programs and requires scalable technical solutions that address both the ever-increasing scope, volume and complexity of information technology and the challenge of translating privacy principles and harms into engineering requirements. Privacy engineering encompasses how privacy values and principles can be applied in technology systems and programs while also recognizing the level of risk involved in securing personal information. Select “Next” to read more about privacy engineering.

Privacy engineering (3)

©2022, International Association of Privacy Professionals, Inc. (IAPP)



Privacy is often the focus in information-intensive systems, such as web-based retail, reservation systems,
electronic health records, and social networking. However, privacy also concerns systems that may not
directly collect or use personal information, but still affect the personal development, free expression and movement of individuals throughout society, such as transportation systems and electronic voting systems.
Because of the personal nature of privacy, it is critical that privacy technologists consider the role that
software plays in privacy and take steps to design systems to protect individual privacy—this is the
function of privacy engineering.

Key concepts of privacy engineering

Implementing effective privacy engineering within an organization relies on three key concepts: (1) data
governance, (2) technological controls, and (3) the engineering life cycle. These three elements help
define how privacy technologists can implement privacy safeguards in ways that are both measurable and
practical and help create a culture of privacy among an organization’s privacy and software engineers.
Select each concept to review.

Data governance:

Understanding personal and nonpersonal data, how it is used, and the privacy risks for any given
data set is essential to creating effective safeguards that are aligned with privacy objectives within
a technology ecosystem. Modeling data and its uses, and continually monitoring that data, are key to data governance but challenging in practice: stakeholders across the organization must come together to create a common taxonomy so that data can be tracked and managed. A model for creating a taxonomy includes:

• Identifying the business objectives, purposes and uses for data
• Knowing the law and policies that shape the limits of what is allowable and what is not
• Implementing the appropriate technology for safeguarding data
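The taxonomy model above can be sketched in code. This is an illustrative sketch only; the field names (business purpose, legal basis, safeguard) are assumptions drawn from the three bullets, not a prescribed NIST or IAPP schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataCategory:
    """One entry in a shared data taxonomy (hypothetical field names)."""
    name: str
    business_purpose: str   # why the data is collected and used
    legal_basis: str        # law or policy that limits what is allowable
    safeguard: str          # technology applied to protect the data

# Example taxonomy agreed on by stakeholders across the organization.
taxonomy = [
    DataCategory("email_address", "account recovery", "consent", "encryption at rest"),
    DataCategory("purchase_history", "order fulfillment", "contract", "role-based access"),
]

def allowed_purposes(entries, category_name):
    """Look up the documented purposes for a named data category."""
    return [e.business_purpose for e in entries if e.name == category_name]
```

With a shared structure like this, each data set can be tracked and managed against the same vocabulary.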

Technological controls:

In order to create technology-centric privacy governance programs, it is essential to link established internal controls to technological controls. Privacy engineering is the result of translating internal controls into technology. Technological controls may include:

• Access control points: limiting access to users with a legitimate need
• Dataflow control points: minimizing the amount of data collected and shared
• Retention control points: deleting data no longer needed
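The three control points above might look like the following minimal sketch. The role names, record fields and 90-day retention window are hypothetical choices for illustration, not requirements from the text:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical set of roles with a legitimate access need.
AUTHORIZED_ROLES = {"support_agent", "privacy_officer"}

def access_allowed(role: str) -> bool:
    """Access control point: only roles with a legitimate need may read."""
    return role in AUTHORIZED_ROLES

def minimize(record: dict, needed_fields: set) -> dict:
    """Dataflow control point: share only the fields a recipient needs."""
    return {k: v for k, v in record.items() if k in needed_fields}

def expired(collected_at: datetime, retention_days: int = 90) -> bool:
    """Retention control point: flag records past their retention window."""
    return datetime.now(timezone.utc) - collected_at > timedelta(days=retention_days)
```

Each function is a direct translation of an internal policy ("need to know," "collect only what is needed," "delete when no longer needed") into a technological control.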

Engineering life cycle:

The engineering life cycle helps build solutions to support business objectives that are both reliable
and scalable to the organization’s needs. When privacy is built into the engineering life cycle
through the standardized tools and infrastructure of the technology ecosystem, engineers can
better implement privacy-protective solutions that are in line with business needs and data
subjects’ expectations. Privacy then becomes a core component of business solutions. Embedding
privacy into the engineering life cycle translates privacy into the terminology, practices and culture
of engineers. As privacy becomes more meaningful to software developers, the practice of privacy
engineering is realized through the natural and scalable enforcement of privacy safeguards within
technology solutions.

Summary

• Due to the personal nature of privacy, it is critical that privacy technologists consider the role
software plays in privacy and take steps to design systems to protect individual privacy, which
is the function of privacy engineering.


• Implementing effective privacy engineering within an organization relies on three key concepts:
(1) data governance, (2) technological controls, and (3) the engineering life cycle.
• Understanding personal and nonpersonal data, how it is used, and the privacy risks for any
given data set is essential to creating effective safeguards that are aligned with privacy
objectives within a technology ecosystem.
• When privacy is built into the engineering life cycle, engineers can better implement privacy-
protective solutions that are in line with business needs.
Review

1. Privacy engineering addresses the challenges of translating privacy principles and harms into
engineering requirements. What key concepts within an organization help realize this? Select all that
apply.

Data governance
Privacy design patterns
Technological controls
Engineering life cycle
Manageability

Privacy engineering objectives


Learning objective

• Examine the use and key impacts of privacy engineering within an organization

Privacy engineering objectives

NIST’s Privacy Engineering Program has proposed three privacy engineering objectives: predictability,
manageability and disassociability. These objectives are intended to be privacy’s version of the traditional
security objectives of confidentiality, integrity, and availability (or CIA). The following slides take a closer
look at each objective.

Predictability

Predictability characterizes reliable assumptions about a system, particularly its data and the processing of
that data by all stakeholders. These stakeholders include not only the individuals to whom the data
pertains, but also system owners and operators. Predictability allows for:

• Privacy principles within a system that are measurable. For example, providing notice and
requiring users to check a box stating they have read and agreed to that notice.
• Stakeholders who can adequately describe what is happening with the personal information in
their possession from a value statement on transparency, to a requirements-based program
that explains how personal information is managed.
• Privacy controls that can expand beyond privacy notices. For example, the use of
deidentification techniques that demonstrate how a system protects individual identity when
their information is shared.
• Trusted relationships between stakeholders and individuals, thereby enabling operators to
implement innovative changes to a system to provide better services.
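As a rough illustration of the first bullet, a measurable notice-and-consent check might be sketched like this; the log structure and function names are assumptions for illustration, not part of the NIST objective itself:

```python
from datetime import datetime, timezone

consent_log = []  # append-only evidence of notice and agreement

def record_consent(user_id: str, notice_version: str, box_checked: bool) -> bool:
    """Log whether the user saw a given notice version and agreed to it."""
    consent_log.append({
        "user": user_id,
        "notice": notice_version,
        "agreed": box_checked,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return box_checked

def has_consent(user_id: str, notice_version: str) -> bool:
    """A reliable (predictable) answer: did this user agree to this notice?"""
    return any(
        e["user"] == user_id and e["notice"] == notice_version and e["agreed"]
        for e in consent_log
    )
```

Because every agreement is recorded against a specific notice version, stakeholders can make reliable, measurable assumptions about what processing the user has been told about.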

Manageability

Manageability is the ability to granularly administer personal information, including modification, disclosure and deletion. This allows systems to:


• Confidently ensure corrections can be made to inaccurate information, only necessary information is collected or disclosed, and privacy preferences are properly implemented and maintained.
• Assign appropriate stakeholders to administer changes to an individual’s information.
• Support any technical measures necessary to protect identity.
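A minimal sketch of manageability's granular operations (correction, preference handling, deletion) might look like the following; the in-memory store and field names are hypothetical stand-ins for a real system:

```python
# Hypothetical in-memory store; a real system would use a database
# with access restricted to appropriate stakeholders.
profiles = {"u1": {"email": "old@example.com", "marketing_opt_in": True}}

def correct(user_id: str, field: str, value) -> None:
    """Manageability: fix inaccurate information on request."""
    profiles[user_id][field] = value

def apply_preference(user_id: str, pref: str, value: bool) -> None:
    """Manageability: honor a privacy preference going forward."""
    profiles[user_id][pref] = value

def erase(user_id: str) -> None:
    """Manageability: delete a subject's record entirely."""
    profiles.pop(user_id, None)
```

Note that these operations are performed by an administrator on the subject's behalf; as the review question below highlights, manageability is not the same as giving individuals direct access to edit their own records.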

Disassociability

Disassociability is the minimization of connections between data and individuals to the extent compatible
with system operational requirements. This minimization can take many forms, from maximally
disassociated data in the form of high-level aggregated data, to deidentified records pertaining to distinct
individuals. Disassociability can also take the form of architectural data separation, in which identifiable
personal information is kept segregated from, but still linkable to, transactional data. This allows the
system to:

• Support more accurate control mapping and risk mitigation through the construction of a
taxonomy of identity classifications such as pseudonymity, deidentification and anonymity.
• Increase the need for advances in techniques that disassociate individuals from their
information.
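The architectural data separation described above (identifiable personal information segregated from, but still linkable to, transactional data) can be sketched as two stores joined only by a random pseudonym; the store layout is an illustrative assumption:

```python
import secrets

# Identifiable data and transactional data are kept apart and
# linked only through a random pseudonym.
identities = {}     # pseudonym -> directly identifying data
transactions = []   # events carry only the pseudonym

def enroll(name: str, email: str) -> str:
    """Store identity separately and return the linking pseudonym."""
    pseudonym = secrets.token_hex(8)  # random; reveals nothing about the person
    identities[pseudonym] = {"name": name, "email": email}
    return pseudonym

def log_purchase(pseudonym: str, item: str) -> None:
    """Record a transaction without any direct identifiers."""
    transactions.append({"who": pseudonym, "item": item})
```

A breach of the transaction store alone exposes no direct identifiers, yet the organization can still link records back to individuals when operationally required.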

Perspectives: Predictability, manageability and disassociability in the real world

Mark Webber, CIPP/E, U.S. Managing Partner, Fieldfisher, IAPP Faculty Member

Privacy engineering principles of predictability, manageability and disassociability. Let’s bring it to life in
the world of cookies and start with the concept of predictability. When we talk about predictability, we’re
talking about making reliable assumptions about a system: what data is collected, how is it processed,
and what people are involved. In the real world, let’s think about cookies. I can’t imagine there’s one of
the viewers today that hasn’t clicked on a cookie banner and either thought about it or at least read some
of those disclosures. In the real world, predictability is putting up a cookie banner, and mapping that
cookie banner to make it work and automate back into our system. It gives notice that a cookie is being
served. That banner has a mechanism in the background to capture consent and record and evidence that
consent. We’ve got evidence that we have consent. We’ve got some predictability, because we said, “Yes,
this user has given consent, therefore we can use a cookie.” This transparency within the banner which
matches to how that is used: If we’ve been transparent about how that data is used, it gives us a process
that we can overlay to make sure that data is used in a predictable manner. And then it also results in
privacy controls. Removing consent or stopping sharing data is another way of predicting. An individual
could remove their consent and then you could make reliable assumptions that, “Well, I’ve not got a
consent anymore, I need to stop that, I need a process to make sure this data is ring-fenced and not
used.”

And that’s when we move into manageability. Essentially, having granular control as an administrator over
the personal data which is being collected. “Can I modify it? Should I disclose it? Can I delete it?” Well,
that goes back to whether or not we’ve collected it in the right way. The cookie banner has collected that
information in a way that it can be used and disclosed in accordance with the disclosures in that cookie
banner. But it can also be deleted or prevented in another way. So, you’ve got some manageability in
terms of access tools and rights tools, but you’ve also got manageability, potentially, in terms of, “I’ve
collected that information [and] I might need to use it at a later date.” Somebody may come in and say,
“I’ve got an access right and you’ve got my personal data.” So, you might need a manageability function
of can I search that data to go and find that individual by matching the cookie they give me with the
cookie I’ve got on my system. If I can find that cookie, I can then go and delete it, because I’ve
introduced manageability. So, I can support privacy preferences, and I can support rights, when I’m
thinking about manageability.

And then, at another level, we’ve got disassociability; essentially, minimizing connections between data
and individuals. Now, a cookie is already one step removed from a direct identifier, but it remains an


identifier. But where possible, the concept of disassociability means removing even more data. Maybe you
can go ahead and pseudonymize that cookie, maybe you can hash it, or maybe because it’s associated
with an IP address you can remove the last octet of that IP address and make it less identifying. We don’t
want to remove the usability of that information, but by disassociating certain traits, we can make it less
identifiable, and that can have benefits in terms of, “Do we still have a notice obligation to those
individuals because it’s no longer fully personal and it’s been pseudonymized, so if we had an incident, we
might get away with having to notify individuals because the law says that we don’t?” Or it might also just
be privacy-enhancing. If others got hold of that information, we’re no longer looking at easily identifiable
information. Something’s been performed to that information in a way that disassociates identity and
makes it more usable and more privacy protective. That probably gives us more basis to use that
information internally, and more justification, because we’re minimizing the risks to that data at every
stage by disassociating it.
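The disassociation techniques Webber mentions (hashing a cookie identifier and dropping the last octet of an IPv4 address) can be sketched as follows. The salted-hash construction is one common approach, assumed here for illustration; a real deployment would manage the salt as a secret:

```python
import hashlib

def pseudonymize_cookie(cookie_id: str, secret_salt: str) -> str:
    """Replace the raw cookie ID with a salted SHA-256 digest.

    Salt management and rotation are out of scope for this sketch.
    """
    return hashlib.sha256((secret_salt + cookie_id).encode()).hexdigest()

def truncate_ip(ip: str) -> str:
    """Drop the last octet of an IPv4 address to make it less identifying."""
    octets = ip.split(".")
    return ".".join(octets[:3] + ["0"])
```

Both transformations keep the data usable (the hash is stable, the truncated IP still locates a network) while disassociating it from a specific individual.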

So, thinking about those principles in the world of cookies, we’ve thought about a way where we can deal
with predictability, we can deal with manageability, and we can deal with disassociability. We're thinking about these principles and applying them in the real world. It really means a granular understanding of the way data is used and the way it
can be managed and protected through different techniques.

Determining privacy capabilities

In addition to the three privacy engineering objectives, NIST suggests companies determine their privacy
capabilities. “Privacy capabilities can be used to describe the system, product, or service property or
feature that achieves the desired privacy outcome.” To determine these privacy capabilities, the privacy
engineering objectives, in the table shown, may be used to assess support needs.

Determining organizational privacy engineering and security needs depends on the company’s business
objectives as well as risk tolerance. Depending on the organization, some needs may be more important
than others and may be highlighted in risk assessments or known through current or emerging risks.

[Table: Privacy Engineering and Security Objectives]


Summary

• NIST’s Privacy Engineering Program has proposed three privacy engineering objectives intended
to be privacy’s version of the traditional security objectives of confidentiality, integrity, and
availability (or CIA).
• Predictability characterizes reliable assumptions about a system: its data and the processing of
that data by all stakeholders.
• Manageability is the ability to granularly administer personal information, including
modification, disclosure and deletion.
• Disassociability is the minimization of connections between data and individuals to the extent
compatible with system operational requirements.
• Privacy capabilities help organizations achieve desired privacy outcomes.

Review

1. How does employing the objective of predictability benefit an organization? Select all that apply.

It increases the need for advances in techniques that disassociate individuals from their
information
It assigns appropriate stakeholders to administer changes to an individual’s information
It supports trusted relationships between stakeholders and individuals, thereby enabling
operators to implement innovative changes to a system to provide better services
It helps stakeholders adequately describe what is happening with the personal information in
their possession from a value statement on transparency to a requirements-based program that
explains how personal information is managed

2. True or false? Manageability includes allowing individuals to have access to their information to make
changes to inaccurate information.

True
False

Privacy design patterns


Learning objective

• Understand the role of privacy design patterns within the field of privacy engineering

Privacy design patterns

Design patterns were introduced into mainstream software engineering through object-oriented
programming where the data structure is the object. The term “design pattern” describes shared solutions
to recurring problems. These design patterns serve to improve program code maintenance by providing
developers with a common mental model when approaching a recurring problem.

There are four elements of a design pattern:

1. A pattern name which references the pattern.
2. A problem description that describes what is intended to be solved, including sufficient information to recognize when the pattern applies.
3. A solution that describes the elements of the design, their relationships, roles and interactions.
4. The consequences that describe the results from applying the pattern and any trade-offs that occur by using or not using the pattern. This information assists the designer in determining whether the pattern’s benefits are an improvement to the design.


These design patterns can be repeatedly used to solve problems and multiple patterns can be combined to
yield more robust solutions. Select the markers to review the elements of a design pattern.

Design patterns to emulate

There are many design patterns to copy that can be used to improve the development of systems. Just as design patterns improve the speed and quality of software development, emerging privacy patterns can be used in the same way to address recurring privacy problems. The University of California, Berkeley School of Information
is working to identify privacy patterns, which they define as “design solutions to common privacy
problems.” For example, a common problem is when users share information from a site with a person
who is not a user of the site, such as with Google Docs, Dropbox or social media sites. A design pattern
used to solve this would be to employ user link sharing, in which a deep randomized link is created by the
site to then be sent to the non-user, thereby protecting the non-user’s personal information.
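The user link sharing pattern can be sketched as generating an unguessable token per shared document; the URL shape and in-memory store are hypothetical:

```python
import secrets

share_links = {}  # token -> document id

def create_share_link(doc_id: str) -> str:
    """Generate a deep, randomized link a non-user can open without an account."""
    token = secrets.token_urlsafe(32)  # ~256 bits of randomness; infeasible to guess
    share_links[token] = doc_id
    return f"https://docs.example.com/shared/{token}"

def resolve(url: str):
    """Serve the document only if the token is known; no login data is collected."""
    token = url.rsplit("/", 1)[-1]
    return share_links.get(token)
```

Because access is granted by possession of the link rather than by an account, the site never needs to collect the non-user's personal information.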

Dark patterns

Dark patterns are recurring solutions used to manipulate individuals into giving up information. “Dark
pattern” was coined by Harry Brignull, a UX specialist, to describe the ways in which software can subtly trick users into taking an unintended action or discourage a behavior that is harmful to the company, such as
unsubscribing from marketing campaigns. For example, a company may hide the unsubscribe button, or
make it difficult to display on a screen, as a sign that unsubscribing is discouraged. These are schemes used in decisional interference, discussed in module 3, and they often use psychology to persuade users,
appealing to a need to belong or nudging users to choose options that benefit the service provided by the
website, such as asking for access to a user’s address book when creating their profile. Knowing these
patterns can give designers the opportunity to use strategies for users to be better informed and have
greater control of their information. Select “Next” to read more about dark patterns.

Dark patterns to avoid (1)

Darkpatterns.org identifies and defines twelve of the most common dark patterns from the perspective of
the user. Select each of the six below for a brief description and then select “Next” to read about the other
six:

Trick questions: While filling in a form you respond to a question that tricks you into giving an
answer you did not intend. When glanced upon quickly the question appears to ask one thing, but
when read carefully it asks another thing entirely.

Sneak into basket: You attempt to purchase something, but somewhere in the purchasing
journey the site sneaks an additional item into your basket, often through the use of an opt-out
radio button or checkbox on a prior page.

Roach motel: You get into a situation very easily, but then you find it is hard to get out of it (e.g.
a premium subscription).

Privacy zuckering: Default privacy settings that are made complex for the end-user by poorly
presenting the available settings, encouraging users to reveal more information than intended.

Price comparison prevention: The retailer makes it hard for you to compare the price of an item
with another item, so you cannot make an informed decision.

Misdirection: The design purposefully focuses your attention on one thing in order to distract your
attention from another.

Dark patterns to avoid (2)

Darkpatterns.org identifies and defines twelve of the most common dark patterns from the perspective of
the user. Select each of the six below for a brief description:


Hidden costs: You get to the last step of the checkout process only to discover some unexpected
charges have appeared, e.g., delivery charges, tax, etc.

Bait and switch: You set out to do one thing, but a different, undesirable thing happens instead.

Confirmshaming: The act of guilting a user into opting into something, like a mailing list or text messages. For example, it has become common for a pop-up window to appear when you visit a website, asking you to enter your name and email address in exchange for a discount. If you decline, a form of shaming occurs, such as a button reading, “No, I will pay full price.” Most users want the discount and do not want to pay full price, so they are “shamed” into providing their contact information.

Disguised ads: Adverts that are disguised as other kinds of content or navigation, in order to get
you to click on them.

Forced continuity: This occurs when a credit card is charged, without warning, after a free trial
ends. For example, you sign up for 30 days of free access to an online newspaper, but a credit card is required to start the trial. After the free trial ends, the credit card is automatically charged without notifying the user.

Friend spam: The product asks for your email or social media permissions under the pretense it
will be used for a desirable outcome (e.g., finding friends), but then spams all your contacts in a
message that claims to be from you.

Summary

• Design patterns serve to improve program code maintenance by providing developers with shared
solutions to recurring problems.
• There are many design patterns for privacy engineers to copy when improving the development of systems.
• Privacy patterns are emerging, and design patterns can be used in the same way to address recurring privacy problems.
• Dark patterns are recurring solutions used to manipulate individuals into giving up information.
They are schemes used in decisional interference and often use psychology to persuade users,
appealing to a need to belong or nudging users to choose options that benefit the service provided
by the website.

Review

1. True or false? Dark patterns are schemes used in decisional interference.

True
False

2. What element of a design pattern describes the components of the design, their relationships, their
roles and how they interact?

Pattern name
Consequence
Problem description
Solution

Review answers

The privacy engineering role in the organization


1. Data governance; Technological controls; Engineering life cycle


Privacy engineering objectives
1. It supports trusted relationships between stakeholders and individuals, thereby enabling
operators to implement innovative changes to a system to provide better services; It helps
stakeholders adequately describe what is happening with the personal information in their
possession from a value statement on transparency to a requirements-based program that
explains how personal information is managed
2. False
Privacy design patterns
1. True
2. Solution

*Quiz questions are intended to help reinforce key topics covered in the module. They are not meant to
represent actual certification exam questions.
