TOWARDS REFUSING AS A CRITICAL TECHNICAL PRACTICE: STRUGGLING WITH HEGEMONIC COMPUTER VISION
Abstract

Hegemonic Computer Vision (CV) forms a 'common sense' that we must break from, and thus requires radical forms of antagonism. The goal of this article is to sketch how refusing CV can be part of a counter-hegemonic practice – be it the refusal to work or other, more creative, responses. The article begins by discussing hegemonic CV, the 'common sense' that frames machine seeing as neutral and impartial, while ignoring its wide application for surveillance. Then, it discusses the emergent notion of refusal, and why critical technical practice can be a useful framework for questioning hegemonic sociotechnical systems. Finally, several potential paths for refusing hegemonic CV are outlined by engaging with different layers of the systems' 'stack.'

CC license: 'Attribution-NonCommercial-ShareAlike'.
Gabriel Pereira: TOWARDS REFUSING ...
The computer scientist Joseph Redmon, creator of the widely-used Computer Vision library YOLO, announced in 2020 he would no longer be developing the algorithm he created. The reason for this is made clear in a paper he co-wrote on the new features of YOLO:

But maybe a better question is: "What are we going to do with these detectors [Computer Vision algorithms] now that we have them?" A lot of the people doing this research are at Google and Facebook. I guess at least we know the technology is in good hands and definitely won't be used to harvest your personal information and sell it to... wait, you're saying that's exactly what it will be used for?? Oh. Well the other people heavily funding vision research are the military and they've never done anything horrible like killing lots of people with new technology oh wait.... (Redmon and Farhadi 4)

This humorous – yet critical – paragraph is crowned by a footnote, which states: "The author is funded by the Office of Naval Research and Google." Redmon's refusal to continue his work with Computer Vision (CV) – algorithms of image analysis and recognition – builds upon his realization of the contribution he makes, as a computer scientist, to algorithms of oppression (see Noble; Ochigame) and their nefarious impacts in society (e.g. consumer surveillance, drone attacks, tracking of migrants by governments). Rather than trying to reform it, he chose to refuse, to deny his labor as part of developing such technology.[1]

Redmon's refusal throws a wrench in the system, breaking from the hegemonic presentation of CV (and AI) as a way of making everything better, faster, and more innovative. The techno-utopian and techno-solutionist discourse on CV pushed by Silicon Valley companies and other tech/government entities presents these technologies as natural developments toward automation. In reality, CV operates under the goal of identifying and naming, classifying and quantifying, and generally organizing the visual world to support surveillance, be it military or commercial. This hegemonic paradigm of Computer Vision forms a 'common sense' that we must break from, and thus requires radical forms of antagonism, refusal, and resistance.

My goal in this article is to sketch a scenario of refusing as a reaction to hegemonic CV, in order to help readers engage in their own practices of antagonism – be it the refusal to work as shown by Redmon or other, more creative, responses. I particularly engage with the notion of refusing because, as seen in Redmon's example, it shifts the discussion away from a reform of technical character and towards a more radical counterhegemonic practice. Towards this goal, I first discuss what I refer to as hegemonic CV, the 'common sense' that frames machine seeing as neutral and impartial, while ignoring its wide application for surveillance. Afterwards, I discuss the notion of refusal, and why I think critical technical practice serves as a useful framework for questioning hegemonic sociotechnical systems. Finally, I outline several paths for refusing hegemonic CV by engaging with different layers of its stack. These potential resistance acts, as I show, can take shape in varied forms – artistic projects, activist initiatives, and community organizing can all offer counterhegemonic pathways for CV.
APRJA Volume 10, Issue 1, 2021
a Luddite approach sees technology as a site of struggle in which antagonism is necessary to challenge hegemonic goals and assumptions – subverting technological control to regain power for workers. Mueller's analysis suggests refusal can be part of a generative politics – not only breaking machines and sabotage, but also forms of struggle such as that over policy and legislation. It means not only to say no to the development of new technical systems, but also to actively envision how we can center other values and paradigms.

All through industry, activism, policy, and research there are increased calls that center rejection (saying "no") as a strategy to combat the problems caused by algorithm development and deployment in society. Just to cite a few recent calls: the "Feminist Data Manifest-No" proposes 10 points of refusal for harmful data regimes; Seeta Peña Gangadharan discusses people's unwillingness "to accept data-driven systems in the terms and conditions that government or private actors present to us" (3); Sarah Hamid, in a remarkable interview, argues for abolishing "carceral technologies," and organizing "against the logics, relationships, and systems that scaffold their existence"; and Chelsea Barabas suggests tech designers should "turn down requests and opportunities to build technologies that are likely to produce harm." All of these pleas are located in a wider global environment in which tech workers have been putting pressure on companies' unethical developments, as shown in the case of Google workers' refusal to build Project Maven and the Tech Won't Build It activist group.

The goal of refusing is compelling, and these initiatives make visible how it is both possible and necessary as a means of antagonizing hegemonic technological systems. There are certainly many ways such disposition can operate, and I'll focus here on just one of them: the concept of "critical technical practice" (referred here as CTP). In his seminal text conceptualizing the term, "Toward a Critical Technical Practice: Lessons Learned in Trying to Reform AI," Phil Agre recognizes "computing has been constituted as a kind of imperialism; it aims to reinvent virtually every other site of practice in its own image" (131). Agre's CTP emerges from his own personal story as a computer scientist who only after some time began to realize the political, social, and cultural constitution of technology, therefore suggesting how practice should avoid the separation between computer science and critical reflection. Practitioners would work interdisciplinarily, "one foot planted in the craft work of design and the other foot planted in the reflexive work of critique," in order to create alternatives that engage computing as a "technical, cultural, and interpersonal object" (Harwood 32).

The problem I see in the conceptualization of CTP brought by Agre is that, as it was created over 20 years ago, it is somewhat detached from the current historical moment: it remains largely reformist, refusing major revolutions and believing that change can be brought through individual practitioners and their "intrapersonal" critical consciousness. The notion of CTP has been further adapted through the past two decades, as scholars/practitioners have spun it into a variety of approaches in design (e.g. Phoebe Sengers, Matt Ratto, Garnet Hertz, Alan Blackwell, Shaowen and Jeffrey Bardzell) or artistic practice (e.g. YoHa and Matthew Fuller, Winnie Soon). What these approaches share is a conceptualization of technological practice as a mode of critique – making computation a way of understanding and re-thinking computation itself.
of the whole. One attempt at looking at the 'stack' of CV has been suggested by Azar, Cox, and Impett, a "vertical cartography" comprised of six different levels (10). While their original goal was organizing critical analyses of CV, I suggest it is also possible to use these layers to envision sites of refusing interventions by artists, researchers, and activists. For this, I both point to important existing initiatives and to layers that still remain to be intervened on. I hope, with this, to offer more questions than answers and suggest critical pathways for a counter-hegemonic practice:

(1) Social level (where are such systems deployed, by whom, for what purpose)

At this level, it's possible to critique and reject systems that are being deployed, as well as imagine alternative technological formations. How can practitioners expose the nefarious impacts of CV, including the way these systems concentrate power and affect the most marginalized? For example, the Coveillance project (coveillance.org) aims to map surveillant technologies in the city space, creating workshops such as "A walking tour of surveillance infrastructure in Seattle," in which the deployment of smart cameras is discussed by organizers and the public.

What technologies shouldn't be used at all? And how can practitioners act on the creation of alternative institutions for the emergent regulation and policy of technology? A particularly powerful example of this is the Seattle Surveillance Ordinance, which has sought to create new systems for the regulation of technologies by involving the affected communities (Lee; Young et al). The Ordinance sought to involve citizens in approving or rejecting emergent technology uses, such as Automated License Plate Readers (ALPR). By creating the possibility of curtailing the operation of these systems, or at least exposing them, the Seattle Ordinance is a major example of how refusal can take place in a policy, activism, and community organizing sense.

Here also lies the discussion on the workers behind CV algorithms. What kinds of actions are possible to allow them to refuse creating certain technologies? Could CV developers refuse the use of their code by the military and big tech companies? The case of Redmon, discussed in the introduction, is just one of many attempts by tech workers to withdraw their labor, a practice in which activists play a major role (Mueller). However, how can practitioners also give other workers, who are often not afforded much power in the system, the possibility of antagonizing the development of CV? (see e.g. the work of xtine burrough with Amazon Mechanical Turkers).

(2) Computational level (which problems are being solved: e.g. 'object detection')

Refusing CV's hegemonic ideology of surveillance and tracking, could other 'problems' be solved? Hegemonic CV focuses mostly on detecting and classifying according to surveillance-oriented parameters. One possibility is the use of CV from a disobedient gaze, "surveilling the most powerful, as opposed to those marginalized" (Pereira 154; see also Barabas et al.). An example of this is VFRAME,
by Adam Harvey and Jules LaPlace, which seeks to create AI/CV tools for human rights uses, including a "Munition Detector" which could locate illegal munitions and support the work of activists.

What if, instead of the obsession for solving problems, CV focused on the errors of these systems "both as a sign of their limitations and as a way to think otherwise" (Pereira 156)? This form of refusal suggests the embrace of error, indeterminacy, opacity, and situatedness instead of the solution of bias and partiality (Amoore). This means using algorithmic ways of seeing as ways of exploring alternative visibilities and creating new connections that couldn't be made otherwise (see Pereira and Moreschi for an example of using CV to look at artworks through the lens of error).

(3) Data level (who labels, which images are chosen, who takes the photographs)

How can we refuse problematic data sets and their troubled histories? Vinay Prabhu and Abeba Birhane even name computer vision a pyrrhic win due to the "problematic practices and consequences of large scale vision datasets" (1). Data sets often rely on the extraction of images without agreement from users (see Harvey & LaPlace; Thylstrup), use precarious labor of Amazon Mechanical Turkers (Irani, "Justice for 'Data Janitors'"), as well as organize seeing through labels that encode racist, classist, and sexist histories (Hanna et al.; Crawford and Paglen; Smits and Wevers). How can designers instead create datasets in ways that are more just, operating other ways of collecting, curating, and organizing images? The project Feminist Data Set, by researcher and artist Caroline Sinders, asked this question, and over many years has been holding workshops and forums to collaboratively investigate these questions in the case of a chat bot. The outcome of this project hasn't been (and won't be) an ultimate response, but different tools and examinations on what data sets could look like if they were thought from a feminist lens.

It's important to also consider how image data sets are themselves partial, and what other data could/should be included in the creation of CV. What becomes possible if, for example, there wasn't an expectation of bigness in data sets, with a practice focusing on small data? (see Prosthetic Memory for an example of a custom-made one-person machine learning tool). What if the practice of collecting images/data is refused, and instead CV is trained on computationally manufactured image data? Could that open a way of not even needing to collect people's image data at all? (see e.g. Harvey's VFRAME project).

(4) Algorithmic/representational level (e.g. Siamese convolutional neural network with Adam gradient descent optimization)

Though much discussion critical of hegemonic CV has tended to focus on the "data problem" (Hooker), the algorithmic level also shapes what ways of seeing are possible, what data are valued, and the particular modes
in collective resistance (much like the Luddites).

A goal for practice then needs to be the much wider change in how data and algorithms are rationalized, which can only happen through alternative institutions and networks of practice. Though not focused on CV, Sabelo Mhlambi has written on how Ubuntu philosophies could change AI's paradigms. He suggests AI could operate away from its dominant culture of personhood based on rationality and individualism. The Ubuntu framework for personhood is based on relationality and different principles: "solidarity, reconciliation, equality, equity, and community" (15). What would it mean to, in fact, put these into practice as guiding notions in CV policy, for example?

Considering refusal as part of this wider stack serves to argue that counterhegemonic responses can happen across different areas, even in places that so far remain little acknowledged. It shows how refusal can have many different facets and intensities – beyond the work of computer scientists like Redmon. It importantly highlights the radical work being done by activists, artists, community organizers, researchers, etc. While useful, this stack is just one of many possible cartographies of action. The Stop LAPD Spying activist group, for example, suggests a framework for mapping surveillance divided on different layers: "Ideological, Institutional, Operational, and Community." Such a way of mapping illuminates how practice also needs to consider how communities should be involved in the creation and use of CV. Which communities should be centered that haven't yet, and which are being marginalized by CV use? This essay only hopes to serve as a starting point, so I leave it to you, the reader, to consider what other 'counter-cartographies' of refusing are possible, and how they can support your practice.

Conclusion: Performing alternatives, bearing witness to limitations

When the computer scientist Joseph Redmon refused to continue his development of Computer Vision, due to his perception of the harms and injustices these technologies were causing, he threw a wrench in the system. Hegemonic Computer Vision advances these algorithms as natural developments, neutral and impartial operations, even though they're mostly used for supporting multiple forms of surveillance. In this article, I've demonstrated how refusing is a powerful counterhegemonic stance to this 'common sense,' especially through personal and collective antagonism to uncontrolled technological development. Bridging this stance with critical technical practice's focus on developing critique through the technical can serve to perform alternatives and pave the way to radical reimaginations – or at least create some 'band-aids' that bear witness to how much work we still need to do.

We needn't only reform CV, but depart from an active refusal of its ideology and organizations, completely breaking it apart in pieces before building something new. Our aim is not minor alterations, but to refuse the unsettling 'common sense' of technological progress through action. Much like the Luddites, refusing to work and sabotaging, but also building new ways of seeing and generating new collective engagements with the visual world.
Crawford, Kate, and Trevor Paglen. Excavating AI: The Politics of Training Sets for Machine Learning. 2019, https://excavating.ai/.

Crehan, Kate A. F. Gramsci's Common Sense: Inequality and Its Narratives. Duke University Press, 2016.

D'Ignazio, Catherine, and Lauren F. Klein. Data Feminism. MIT Press, 2020.

Gangadharan, Seeta Peña. Technologies of Control and Our Right of Refusal. TEDx London. 2019, http://eprints.lse.ac.uk/101157/4/Gangadharan_technologies_of_control_remarks.pdf.

Hooker, Sara. "Moving beyond 'Algorithmic Bias Is a Data Problem.'" Patterns, vol. 2, no. 4, Apr. 2021, p. 100241. ScienceDirect, doi:10.1016/j.patter.2021.100241.

Irani, Lilly. "Justice for 'Data Janitors.'" Public Books, 15 Jan. 2015, https://www.publicbooks.org/justice-for-data-janitors/.

---. "The Cultural Work of Microwork." New Media & Society, vol. 17, no. 5, 2015, pp. 720–39.

Laclau, Ernesto, and Chantal Mouffe. Hegemony and Socialist Strategy: Towards a Radical Democratic Politics. 2nd ed., Verso, 2001.
Sadowski, Jathan. Too Smart: Digital Capitalism Is Extracting Data, Controlling Our Lives, and Taking Over the World. MIT Press, 2020.

Tubaro, Paola, et al. "The Trainer, the Verifier, the Imitator: Three Ways in Which Human Platform Workers Support Artificial Intelligence." Big Data & Society, vol. 7, no. 1, 2020, doi:10.1177/2053951720919776.