Building Consentful Tech

[Cover art: the words "Freely Given", "Reversible", "Informed", "Enthusiastic", "Specific" arranged around the title]
Contents

3 What does consent have to do with technology?
4 Content Warning
5 Consentful technology is about having control over our digital bodies
10 Consent makes technology more just and equitable
23 Consentful technology moves us towards consent culture
24 Resources
24 Acknowledgments

2017

You are free to:

Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
But what about consent beyond our physical bodies? These days, we also have
digital bodies. Digital bodies are made up of pieces of personal data. Like our
physical bodies, our digital bodies exist in relationship with others and can
participate in communities. They can also experience harm. Although the harm
to them might not be physical, our digital bodies are frequently acted upon in
non-consensual ways:
10 BUILDING CONSENTFUL TECH

Who profits?
Who is harmed?
In a lot of cases, you’ll find that those who might experience harm such as
There are many ways to make technology more just and equitable, and consent
is one important consideration. Non-consentful features and interactions can
be minor nuisances for some people, but can be very harmful to others. When
Facebook introduced photo tagging, anyone could tag you in a photo, whether
or not you were okay with it. For some users, that could lead to embarrassment
if the photo wasn’t particularly flattering. But for other people, the harm could
be much more serious. For trans users, tagging photos from their pre-transition
lives without their consent could lead to them being outed, which can have
consequences for employment, housing, safety, and more.
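The alternative described here — asking for consent before a tag ever becomes visible — can be sketched in code. This is an illustrative sketch only, not Facebook's actual system: the class names, fields, and approval flow are all invented for the example.

```python
# Sketch of a consent-first tagging flow: a tag is a private suggestion
# until the person being tagged approves it. All names here are invented.
from dataclasses import dataclass

@dataclass
class Tag:
    photo_id: str
    tagged_user: str
    suggested_by: str
    approved: bool = False  # hidden from everyone until approved

class TagStore:
    def __init__(self):
        self.pending = []    # suggestions visible only to the tagged person
        self.published = []  # tags the tagged person has consented to

    def suggest_tag(self, photo_id, tagged_user, suggested_by):
        # The suggestion is never shown to the tagged person's network.
        tag = Tag(photo_id, tagged_user, suggested_by)
        self.pending.append(tag)
        return tag

    def approve(self, tag, acting_user):
        # Only the person being tagged can consent to publication.
        if acting_user != tag.tagged_user:
            raise PermissionError("only the tagged person can approve")
        tag.approved = True
        self.pending.remove(tag)
        self.published.append(tag)

    def decline(self, tag, acting_user):
        if acting_user != tag.tagged_user:
            raise PermissionError("only the tagged person can decline")
        self.pending.remove(tag)  # discarded; nothing is ever published
```

The design choice that matters is the default: the tag starts invisible, and only an explicit act of consent by the tagged person makes it public.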
When technology is built without asking these questions from the beginning,
serious harm can happen. It often takes multiple instances of harm for a patch
to be designed. Popular photo and video sharing platforms, for instance, have
been used to circulate images of acts of sexual violence, which re-harms people
who have experienced violence and perpetuates violence in our culture. It is
extremely difficult to delete these images once they have been distributed.
These are obviously not the use cases the developers and owners of these
platforms were intending, but they do illustrate the harm that can happen
when we fail to design with consent in mind from the ground up, and
foreground the concerns of users who could be severely and unjustly impacted.
If we can’t make it safer then we can acknowledge the remaining risks and
educate users about them. Let’s build industry standards for reporting risks at
rest, in transit and during processing. Additional standards for functionality-
based risk would help too. This is a big problem: it requires software
developers, industry groups, advocates and users all working together, and it
starts by having this conversation. Let’s talk about how risky it currently is to
use software, and how we can make it safer and more accessible for everyone.
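What might a standard for reporting risks at rest, in transit, and during processing look like in practice? No such industry standard exists yet, so the sketch below is purely hypothetical — every field name is invented — but it shows the kind of machine-readable disclosure that tools and advocates could build on.

```python
# Hypothetical risk disclosure for an imaginary app. The schema, field
# names, and entries are all invented for illustration; no real standard
# or product is being described.
RISK_DISCLOSURE = {
    "software": "ExamplePhotoApp",
    "risks": {
        "at_rest": [
            {"data": "photos", "risk": "stored unencrypted on device",
             "mitigation": "enable full-disk encryption"},
        ],
        "in_transit": [
            {"data": "photos", "risk": "metadata visible to network observers",
             "mitigation": "transport encryption hides content but not timing"},
        ],
        "during_processing": [
            {"data": "face data", "risk": "third-party tagging service sees images",
             "mitigation": "opt out of auto-tagging"},
        ],
    },
}

def risks_for(stage):
    """Let a user, or a tool acting for them, query risks by stage."""
    return [entry["risk"] for entry in RISK_DISCLOSURE["risks"][stage]]
```

A shared format like this would let users compare software, and let advocacy groups audit disclosures automatically, in the same way standardized nutrition labels enable comparison shopping.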
bar for security, but also helps with scaling: if everyone is sharing some of the
processing, the servers can do less work. In this model your digital body is no
longer spread throughout servers on the internet; instead the applications come
to you and you directly control how they interact with your data.
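The "applications come to you" model can be sketched in a few lines. This is an illustrative toy, not any real platform's API: the point is only that the computation travels to the data, and nothing leaves the device without the user's say-so.

```python
# Toy sketch of a local-first model: the user's data never leaves their
# device; applications run locally, and only results the user approves
# are shared. All names here are invented for illustration.
def average_heart_rate(readings):
    return sum(readings) / len(readings)

class LocalDataStore:
    """The user's digital body stays on the user's device."""
    def __init__(self, readings):
        self._readings = readings

    def run(self, computation, user_approves):
        # The application is handed to the data, not the data to the
        # application; nothing is shared unless the user consents.
        result = computation(self._readings)
        return result if user_approves(result) else None

store = LocalDataStore([62, 71, 68, 65])
shared = store.run(average_heart_rate, user_approves=lambda r: True)
withheld = store.run(average_heart_rate, user_approves=lambda r: False)
```

Here `shared` is the single derived number the user chose to release, while `withheld` is `None` — the raw readings are never exposed in either case.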
1. https://www.cis.upenn.edu/~aaroth/Papers/privacybook.pdf
2. https://crypto.stanford.edu/craig/craig-thesis.pdf
3. http://scuttlebot.io
Consent is an ongoing process
DIGITAL COMMUNITIES
More and more, our digital bodies exist in digital networks and communities.
Whenever there are multiple relationships between people, a type of
community or network is created. One type is formed when people sign up
for a service — as users, they are now in relationship with whoever owns
and works on the technology. Another type of community is created when
users interact with each other. Digital communities can overlap with physical
communities.
• Create and affirm values & practices that resist abuse and oppression and
encourage safety, support, and accountability
• Develop sustainable strategies to address community members’ abusive
behavior, creating a process for them to account for their actions and
transform their behavior
• Commit to ongoing development of all members of the community, and
the community itself, to transform the political conditions that reinforce
oppression and violence
• Provide safety & support to community members who are violently
targeted that respects their self-determination1
change the way that the creators of algorithms are held accountable for the
harms that their biases cause?
1 http://www.incite-national.org/page/community-accountability
Small changes can make a big difference when we add a little friction to
pathways used for abusive behavior, and when we make it easier for people
to help each other. For example, new users might have a quieter voice until
they’ve been around a while, or messages mentioning you could be downvoted
by your friends so you won’t see them.
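Both frictions mentioned above can be sketched in a few lines of code. These are invented illustrations — not any platform's real API — of how small the required changes can be.

```python
# Two small frictions, sketched with invented names for illustration:
# new accounts get reduced reach, and mentions your friends have flagged
# are filtered out before you ever see them.
def reach(account_age_days, base_audience):
    """New users have a 'quieter voice' until they've been around a while."""
    if account_age_days < 30:
        return base_audience // 10  # throttled reach for new accounts
    return base_audience

def visible_mentions(mentions, downvote_threshold=2):
    """Hide mentions that enough of your friends have downvoted."""
    return [m for m in mentions if m["friend_downvotes"] < downvote_threshold]
```

Neither function blocks anyone outright; each just tilts the defaults so that abuse takes more effort and mutual aid takes less.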
Consentful technology moves us towards consent culture
The cost of interacting with technology securely is quite high, and those least
able to pay that cost are also those most at risk of harm when things break.
Just as we should not blame survivors for sexual violence, we must not place
the burden of safety on users, making them both responsible for protecting
themselves and the ones who suffer the consequences when protection fails.
INCITE!
http://www.incite-national.org
Troll Busters
https://www.troll-busters.com
Who wrote this?
FEEDBACK?
We welcome your comments on this
zine. To send feedback, please visit
http://bit.ly/2yXQaZy
Acknowledgments
This zine would not exist without funding and moral
support from Allied Media Projects and the Mozilla
Foundation.