
VALUES FOR SMART PRODUCTS

Lecture 3 RESTS
May 15th, 2023
Dr. Jan Peter Bergen
Today’s lecture
 Ethical challenges of smart products: examples
 Value 1: Justice
 Costs, benefits and justice
 Distributive justice
 Procedural justice
 Value 2: Inclusiveness
 Inclusiveness as accessibility
 Inclusiveness as representation
 Value 3: Privacy
 Why privacy is important
 Kinds of privacy
 Privacy as contextual integrity (a baseline for design)

Goal of today’s lecture: to gain conceptual understanding of central values, and of how they affect the design of (smart) products
The smart invasion
Amazon’s ‘smart’ hiring (2014-2018)
COMPAS recidivism estimation

 COMPAS AI (used, a.o., in NY, WI, FL, …): a system to predict recidivism in criminal cases (it influences incarceration before actual court hearings, with its own set of consequences)

 ‘Race’/ethnicity was not an explicit category in the input data
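Excluding a sensitive attribute does not remove its signal if other inputs correlate with it. A minimal sketch of this proxy-variable effect (all records, postcodes, and group labels below are invented for illustration; this is not COMPAS data):

```python
# Toy illustration of a proxy variable (all data invented, not COMPAS data):
# the sensitive 'group' column is excluded from the inputs, but 'postcode'
# correlates with it, so a trivial rule recovers it well above chance.
from collections import Counter, defaultdict

records = [
    {"postcode": "N1", "group": "A"}, {"postcode": "N1", "group": "A"},
    {"postcode": "N1", "group": "A"}, {"postcode": "N1", "group": "B"},
    {"postcode": "S9", "group": "B"}, {"postcode": "S9", "group": "B"},
    {"postcode": "S9", "group": "B"}, {"postcode": "S9", "group": "A"},
]

# "Train" on the non-sensitive feature only: majority group per postcode.
by_postcode = defaultdict(Counter)
for r in records:
    by_postcode[r["postcode"]][r["group"]] += 1
proxy_rule = {pc: c.most_common(1)[0][0] for pc, c in by_postcode.items()}

# The excluded attribute is recovered from the proxy for most records.
recovered = sum(proxy_rule[r["postcode"]] == r["group"] for r in records)
print(f"recovered {recovered}/{len(records)} labels without the 'group' column")
```

The same mechanism lets a recidivism model reproduce race-correlated patterns even when ‘race’ is never an input column.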
Predictive policing

 E.g., the PredPol crime prediction system focuses on neighbourhoods with many minorities and bases its estimations on reported incidents instead of the actual number of crimes (a self-fulfilling prophecy)
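The self-fulfilling prophecy can be made concrete with a toy simulation (all numbers invented): two districts have identical true crime, but the one with more initial reports receives more patrols, patrols generate more reports, and the gap keeps growing.

```python
# Toy feedback-loop simulation (all numbers invented): patrols are allocated
# in proportion to *reported* incidents, and each patrol surfaces extra
# reports, so an initial reporting gap widens even though true crime is
# identical in both districts.

TRUE_CRIMES = {"district_a": 100, "district_b": 100}  # identical ground truth
reports = {"district_a": 60, "district_b": 40}        # initial reporting gap
TOTAL_PATROLS = 10
DETECTION_PER_PATROL = 5  # extra incidents reported per patrol assigned

for _ in range(5):  # five allocation rounds
    total = sum(reports.values())
    patrols = {d: round(TOTAL_PATROLS * r / total) for d, r in reports.items()}
    # Patrols only add reports where they are sent; true crime never changes.
    for d in reports:
        reports[d] += patrols[d] * DETECTION_PER_PATROL

print(reports)  # the absolute gap between districts has widened, not closed
```

The initial gap of 20 reports grows to 70 after five rounds, while the underlying crime rates never differed at all.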
Self-driving cars and moral dilemmas
The Ford Pinto

• A two-door car designed at an unprecedented pace in the 1970s.

• A design flaw: the gas tank was situated behind the rear axle. A rear-end collision (at >35 km/h) could rupture the gas tank, with a disastrous fire for passengers in the back seat.

• Ford was aware of the problem just before releasing the Pinto on the market, but chose not to repair it, a decision they later defended in court using a simple Cost-Benefit Analysis
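The arithmetic behind that defence can be reconstructed from the figures commonly reported from Ford’s internal memo (treat the exact dollar amounts and counts as reported values from the secondary literature, not verified ones): expected payouts for harm were simply weighed against the per-unit cost of the fix.

```python
# Cost-benefit figures as commonly reported from Ford's internal memo
# (widely cited in the literature; treat them as reported, not verified).

# "Benefit" of fixing the tank = harm avoided, priced in 1970s dollars:
benefit = (
    180 * 200_000      # projected burn deaths avoided, at $200,000 each
    + 180 * 67_000     # projected serious burn injuries avoided, at $67,000 each
    + 2_100 * 700      # burned vehicles avoided, at $700 each
)

# Cost of the fix: ~$11 per unit across ~12.5 million cars and light trucks:
cost = 12_500_000 * 11

print(f"benefit of fix: ${benefit:,}")   # $49,530,000
print(f"cost of fix:    ${cost:,}")      # $137,500,000
print(f"fix 'not worth it' by the memo's logic: {cost > benefit}")
```

The ethical objection is not to the arithmetic but to the premise: that a statistical price on deaths and injuries settles whether a known, fixable hazard may be shipped.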
Value 1:
Justice
Distributing fairly
Distributive Justice, then!

 Part of theories of justice (i.e., theories that formulate criteria to assess the fairness of the distribution of material and non-material goods in society)
 Any self-respecting theory of distributive justice has at least these
two elements:
 A determination of what is to be distributed (distribution of what?). E.g.,
 Income, opportunity, outcomes, rights and liberties,…
 A principle for distribution, e.g.,
 Egalitarian, meritocratic, hierarchical,…

 This leads to some idea of what a ‘just’ distribution pattern looks like
Rawls’ theory of justice

John Rawls, A Theory of justice, 1971

Reference:
 “Contractualism” (Hobbes, Locke) → which political rules would people agree upon in the absence of the State (in the “state of nature”)?

Driving idea:
 finding political principles that rational agents with different
moral convictions would accept under ideal conditions
Rawls’ theory of justice

Basic concepts:

 “original position”: we should think of agents as deliberating about the basic rules of a (multicultural) society …
 “veil of ignorance”: … while not knowing what their own social
position in that society will be (whether they will be rich/poor,
talented, part of majority/minority culture etc.)
 “overlapping consensus”: … in order to find some general principles
of justice on which they would agree
 “Social and Natural Lottery”: one’s social/political/economic
position in society is to significant degree determined by
happenstance, by luck (or lack thereof)
Rawls’ theory of justice

Two principles of justice:

1) First Principle: Each person has the same indefeasible claim to a fully adequate scheme of equal basic liberties, which scheme is compatible with the same scheme of liberties for all. E.g.,
 Freedom of thought
 Freedom of speech
 Right to vote
 Etc.
Rawls’ theory of justice

Two principles of justice:

2) Second Principle: Social and economic inequalities in the distribution of primary goods (e.g., health, income, mobility, the social bases of self-respect, etc.) are to satisfy two conditions:
a) They are to be attached to offices and positions open to all under
conditions of fair equality of opportunity;
b) They are to be to the greatest benefit of the least-advantaged
members of society (the difference principle). (JF, 42–43)
Bit of Practice

 Distribution of what? Which ‘goods’ are distributed?
 Which of these is subject to a distributive principle?
 What is the guiding distributive principle?
 What is the role of luck/the lottery?
Is Werewolves ‘fair’?

 Distributionally? Minimally…
 Some players have a far higher chance of losing (villagers, doctor, seer),
one hasn’t even got a chance to win or lose (the moderator), some have more
agency in how the game develops (werewolves, seer), etc. Little equity in
distribution.

 So why is this (presumably) not unfair?
 No real risk, so no real gains or losses?
 Many games possible, so it evens out?
 And/or, is it because everybody agreed to the rules and the possible
inequalities?
Procedural justice, then!

 Part of theories of justice (i.e., theories that formulate criteria to assess the
fairness of the distribution of material and non-material goods in society)
 But instead of the fairness of distributions, it focuses on the fairness of the
procedures to reach decisions on what to do (e.g., how to distribute)
 A fair decision-making procedure could justify unequal distributions of
goods (e.g., indivisible risks)
Example: Dessel, Belgium

 Above-ground storage of low- and medium-level, short-lived nuclear waste (under construction)

 Local population is not only involved in decision-making (consent), but also in the actual design of the facility, its use across the life cycle, and the compensation efforts for local communities.

 Partnerships already underway for 15+ years

 Plans include an info/communication and community center
Fair procedure

 A fair decision-making procedure usually shares a number of characteristics:
 Neutrality/non-bias
 Trustworthiness of authorities/decision-makers
 Respect for one another’s autonomy
 Transparency
 Education
 Participation and Voice for stakeholders

 Central concept: Informed Consent
E.g., you can’t simply impose (risky) things on someone else without taking their own autonomous decisions into account
Value 2:
Inclusiveness
The value of pluralism and stakeholder
involvement in VSD
 Designers have individual value orientations
 E.g., you may have an implicit preference for sustainability, rather than privacy.

 Danger: ‘I-methodology’, or designing for people like you.

 Engaging with stakeholders (direct and indirect) is a way to recognize and become
explicit about the values which you are designing for.

 Often not as problematic in practice as the relativism accusation seems to imply:
 E.g., members and groups often agree on the importance of many values like liberty,
wellbeing, safety, security, and justice.
 Even if they have different understandings of particular values and ways of prioritizing
values in a specific context
 Still conflict!
‘The Average Human
does not exist’

 Design is often done for a ‘normal user’

 However, humans come in many shapes, sizes, colors, identities, sensibilities, capabilities, etc. Difference is the norm, not the exception

 Sometimes, these differences represent certain impairments, e.g.,
 Color-blindness
 Tremors
 Forgetfulness
 Depression
 Wheelchair bound
 …
Intermezzo: Models of Disability

Medical Model of Disability:

“The Medical Model views disability as a defect within the individual. Disability is an aberration compared to normal traits and characteristics. In order to have a high quality of life, these defects must be cured, fixed, or completely eliminated. Health care and social service professionals have the sole power to correct or modify these conditions.”

Social Model of Disability:

“This model states that disability is the inability to participate fully in home and community life. The interaction between functional limitations or impairments and physical and social barriers to full participation create disabling environments. The social model distinguishes between disabilities and impairments. Disabilities are restrictions imposed by society. Impairments are the effects of any given condition. The solution, according to this model, lies not in fixing the person, but in changing our society.”
Design-for-Inclusiveness

 There are methods and principles for Designing for Inclusiveness (see Keates,
2015 on Canvas), e.g.,
 participatory and cooperative design (involve broad range of users as equal
members of the design team) or
 contextual design (increased understanding of the context of use through
ethnographic study)
 More accessible and/or affordable methods include:
 Empathy
 User evaluation or user observation sessions
 Simulation aids
 Outsourcing
 Best practice/design guidance
Inclusiveness and Distributive Justice:
The Capability Approach

“The CA recognises the influence that a person’s environment has in enhancing or restricting the set of opportunities that are available for his or her to choose (Nussbaum, 2011; Robeyns, 2005). It takes into account not only the diversity of individuals’ characteristics (e.g. preferences, values, needs, and abilities), but also the societal structures and constraints affecting individuals’ capacities to convert resources and opportunities into functionings. This interaction between internal capabilities and external environment is what Nussbaum (2011) refers to as ‘combined capabilities’.”
(Pereira et al., 2017, p. 176)
Inclusiveness-as-representation

 Why just represent? We can also create active engagement
Value 3:
Privacy
May someone take over the world?

• Nineteen Eighty-Four: dystopian novel by the English author George Orwell
• Published in 1949 (early Cold War)
• Describes a repressive political regime in the UK, called English Socialism
• Control is exerted by the sole party leader, Big Brother
• Via a sophisticated network of surveillance technology (cameras, microphones, telescreens, …)
• And systematic mental manipulation (propaganda/brainwashing)
The surveillance society
According to Reiman (1995):

“privacy is the condition in which others are deprived of access to you.”

Important considerations:

• It is about more than information
• As such, it is about more than data!
• It is not defined by control!

• Nevertheless, ‘informational privacy’ is increasingly relevant and “is concerned with the interest of individuals in exercising control over access to information about themselves […] Think here, for instance, about information disclosed on Facebook or other social media. All too easily, such information might be beyond the control of the individual.” [1]
Why is Privacy
important?
The value of privacy

 Reasons why privacy is important (not an exhaustive list):

• It is a fundamental Human Right
• Harm prevention
• It is necessary for forming social relationships and intimacy
• It expands/safeguards liberties
• It exemplifies respect for our human dignity
• It helps us develop into mature individuals with autonomy
Privacy as a fundamental Human Right

 Because it is a fundamental human right, recognized in the Universal Declaration of Human Rights, the Charter of Fundamental Rights of the EU, et al.

 “No one shall be subjected to arbitrary interference with his privacy, family, home or
correspondence, nor to attacks upon his honour and reputation. Everyone has the
right to the protection of the law against such interference or attacks.”

• Individually, it is necessary for the development of persons’ autonomy, self-ownership and self-growth
• Socially, it is thus necessary for a healthy democracy
Prevention
of Harm
Having private information
become public may cause
actual harm
Necessary for intimacy and social
relationships
• Some argue that without privacy (or at least control over it), one couldn’t
have intimacy

• That is, without privacy, we couldn’t develop into moral, social persons
that are capable of entering into relationships with other people that are
characterized by trust, love, respect, friendship,…

• “Indeed, love, friendship and trust are only possible if persons enjoy
privacy and accord it to each other […] Privacy allows one the freedom to
define one’s relations with others and to define oneself. In this way,
privacy is also closely connected with respect and self respect.” [1]

• “Privacy accords us the ability to control who knows what about us and
who has access to us, and thereby allows us to vary our behavior with
different people so that we may maintain and control our various social
relationships, many of which will not be intimate.” [1]
What do we want to keep private?

• Physical privacy: the freedom to, within a given space, not be seen, heard and/or touched by others

• Decisional privacy: the exclusion of others from decisions that properly belong to the individual and those close to them (e.g., relationship or healthcare decisions)

• Informational privacy: the freedom from having others receive information about you without your consent

All of these can be violated through Smart Products

Anita Allen (1987) proposed a fourth:

• Dispositional privacy: restricts the capacity of others to gather knowledge of and insight into one’s state of mind
Mental state broadcasting on cars: Traffic safety is a collective responsibility. Part of this takes the form of anticipating other drivers’ behaviour behind the wheel. Behavioural science indicates that our mental state significantly affects driving style and consistency. To allow other road users to better anticipate drivers, car manufacturers are proposing ‘mental state broadcasting’, whereby in-car cameras and biosensors read the driver’s mental state (e.g., frustrated, distracted, sad, overjoyed, pensive, concentrated, miserable, etc.), which is broadcast through symbols and colours on the outside of the car that are clearly visible to other road users.

• E.g., emotion recognition technology keeps being further developed and can be applied to different kinds of data input
Smart Products and Privacy

 Smart Products and IT can be used to harm informational privacy, but also accessibility and decisional privacy.

 Personal information becomes more easily available because of smart products/infrastructures and can more easily be distributed or leaked

 Smart Products can also be used to monitor and track individuals (surveillance), and to harass or interfere with them (informational and physical privacy)

 Smart Products make it easier to collect information about individuals without their knowledge, and to combine different information sources
Seven core principles

 Lawfulness, fairness and transparency
Processing must be lawful, fair, and transparent to the data subject.
 Purpose limitation
You must process data for the legitimate purposes specified explicitly to the data subject when you collected it.
 Data minimization
You should collect and process only as much data as absolutely necessary for the purposes specified.
 Accuracy
You must keep personal data accurate and up to date.
 Storage limitation
You may only store personally identifying data for as long as necessary for the specified purpose.
 Integrity and confidentiality
Processing must be done in such a way as to ensure appropriate security, integrity, and confidentiality (e.g. by using encryption).
 Accountability
The data controller is responsible for being able to demonstrate GDPR compliance with all of these principles.
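Purpose limitation and data minimization lend themselves to mechanical enforcement: declare, per purpose, which fields are necessary, and reject any collection request that asks for more. A minimal sketch (the purposes and field names below are invented examples, not a GDPR-mandated schema):

```python
# Minimal purpose-limitation / data-minimization gate. The purposes and
# field names below are invented examples, not a GDPR-mandated schema.

ALLOWED_FIELDS = {
    "order_fulfilment": {"name", "shipping_address", "email"},
    "fraud_detection": {"email", "payment_fingerprint"},
}

def collect(purpose: str, requested_fields: set[str]) -> set[str]:
    """Return the fields that may be collected, or raise if the request
    exceeds what the declared purpose requires (data minimization)."""
    allowed = ALLOWED_FIELDS.get(purpose)
    if allowed is None:
        raise ValueError(f"no declared purpose: {purpose!r}")  # purpose limitation
    excess = requested_fields - allowed
    if excess:
        raise ValueError(f"fields not necessary for {purpose!r}: {sorted(excess)}")
    return requested_fields

collect("order_fulfilment", {"name", "email"})         # fine: subset of necessary
# collect("order_fulfilment", {"name", "birth_date"})  # raises: 'birth_date' is excess
```

Making the allowlist explicit also serves the accountability principle: the declared purposes become an auditable artifact rather than an implicit habit.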
Privacy in public

 Privacy as “contextual integrity” (Helen Nissenbaum)

 Every arena of life is governed by “norms of information flow”.
 These provide a baseline as to what one can expect in terms of privacy in a specific situation/space

 Being in a public place does not imply that ‘anything


goes’ in terms of collection and storage of personal
information
Privacy in public

 Even in the public space, there are:

A. Norms of appropriateness: which information is appropriate to disclose/acquire in a given context

 e.g., just because you are driving on a public road, this doesn’t mean that your name, sexual orientation, medical history, address or profession should be made public to other road users (even though at least most of these are known by some others)
Privacy in public

 Even in the public space, there are:

B. Norms of distribution: which information, once available to someone, may be transferred to whom

 E.g., even assuming that someone may have access to your name, address etc., e.g., a policewoman who stops you for a routine check, this doesn’t mean that she can share this information with a private company for commercial purposes
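Nissenbaum’s two kinds of norms can be read as a small policy table: appropriateness governs which information types may be acquired in a context at all, and distribution governs to which recipients an already-acquired item may be passed on. A toy encoding, with invented contexts, info types, and recipients:

```python
# Toy encoding of contextual integrity (contexts/recipients invented):
# norms of appropriateness: which info types may be acquired in a context;
# norms of distribution: to whom acquired info may then be passed on.

APPROPRIATENESS = {
    "traffic_stop": {"name", "address", "licence_status"},
    "public_road":  {"vehicle_type"},
}

DISTRIBUTION = {
    # (context, info_type) -> recipients the info may be shared with
    ("traffic_stop", "name"): {"police_database"},
    ("traffic_stop", "licence_status"): {"police_database", "court"},
}

def flow_ok(context: str, info_type: str, recipient: str) -> bool:
    """A flow respects contextual integrity only if the info type is
    appropriate to the context AND the recipient is a permitted one."""
    appropriate = info_type in APPROPRIATENESS.get(context, set())
    permitted = recipient in DISTRIBUTION.get((context, info_type), set())
    return appropriate and permitted

# The officer may log your name, but not sell it to a marketing firm:
print(flow_ok("traffic_stop", "name", "police_database"))   # True
print(flow_ok("traffic_stop", "name", "ad_broker"))         # False
print(flow_ok("public_road", "medical_history", "anyone"))  # False: inappropriate
```

For a smart product, such a table is a design baseline: every sensor reading and every data-sharing path should map onto an explicit (context, info type, recipient) entry rather than defaulting to “anything goes in public”.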
Contextual Integrity in smart tech practice

 What information is okay for examiners to collect about students during an exam?
 Who can the information be shared with?
 Think about different kinds of information/privacy:
 Physical
 Decisional
 Informational
 Dispositional