
Gabriel Pereira

TOWARDS REFUSING AS A CRITICAL TECHNICAL PRACTICE: STRUGGLING WITH HEGEMONIC COMPUTER VISION
Abstract

In reality, hegemonic CV is a particular way of seeing that operates under the goal of identifying and naming, classifying and quantifying, and generally organizing the visual world to support surveillance, be it military or commercial. This hegemonic paradigm forms a ‘common sense’ that is necessary to break from, and thus requires radical forms of antagonism. The goal of this article is to sketch how refusing CV can be part of a counter-hegemonic practice – be it the refusal to work or other, more creative, responses. The article begins by describing hegemonic CV, the ‘common sense’ that frames machine seeing as neutral and impartial, while ignoring its wide application for surveillance. Then, it discusses the emergent notion of refusal, and why critical technical practice can be a useful framework for questioning hegemonic sociotechnical systems. Finally, several potential paths for refusing hegemonic CV are outlined by engaging with different layers of the systems’ ‘stack.’

APRJA Volume 10, Issue 1, 2021


ISSN 2245-7755

CC license: ‘Attribution-NonCommercial-ShareAlike’.

The computer scientist Joseph Redmon, creator of the widely-used Computer Vision library YOLO, announced in 2020 he would no longer be developing the algorithm he created. The reason for this is made clear in a paper he co-wrote on the new features of YOLO:

But maybe a better question is: “What are we going to do with these detectors [Computer Vision algorithms] now that we have them?” A lot of the people doing this research are at Google and Facebook. I guess at least we know the technology is in good hands and definitely won’t be used to harvest your personal information and sell it to... wait, you’re saying that’s exactly what it will be used for?? Oh. Well the other people heavily funding vision research are the military and they’ve never done anything horrible like killing lots of people with new technology oh wait.... (Redmon and Farhadi 4)

This humorous – yet critical – paragraph is crowned by a footnote, which states: “The author is funded by the Office of Naval Research and Google.” Redmon’s refusal to continue his work with Computer Vision (CV) – algorithms of image analysis and recognition – builds upon his realization of the contribution he makes, as a computer scientist, to algorithms of oppression (see Noble; Ochigame) and their nefarious impacts in society (e.g. consumer surveillance, drone attacks, tracking of migrants by governments). Rather than trying to reform these systems from within, he chose to refuse, to deny his labor as part of developing such technology.[1]

Redmon’s refusal throws a wrench in the system, breaking from the hegemonic presentation of CV (and AI) as a way of making everything better, faster, and more innovative. The techno-utopian and techno-solutionist discourse on CV pushed by Silicon Valley companies and other tech/government entities presents these technologies as beneficial automation. In reality, CV operates under the goal of identifying and naming, classifying and quantifying, and generally organizing the visual world to support surveillance, be it military or commercial. This hegemonic paradigm of Computer Vision forms a “common sense” that is necessary to break from, and thus requires radical forms of antagonism, refusal, and resistance.

My goal in this article is to sketch a scenario of refusing as a reaction to hegemonic CV, in order to help readers engage in their own practices of antagonism – be it the refusal to work as shown by Redmon or other, more creative, responses. I particularly engage with the notion of refusing because, as seen in Redmon’s example, it shifts the discussion away from a reform of technical character and towards a more radical counterhegemonic practice. Towards this goal, I first describe what I name as hegemonic CV, the ‘common sense’ that frames machine seeing as neutral and impartial, while ignoring its wide application for surveillance. Afterwards, I discuss the notion of refusal, and why I think critical technical practice serves as a useful framework for questioning hegemonic sociotechnical systems. Finally, I outline several paths for refusing hegemonic CV by engaging with different layers of its stack. These potential resistance acts, as I show, can take shape in varied forms – artistic projects, activist initiatives, and community organizing can all offer counterhegemonic pathways for CV.


Hegemonic CV: The limited ways of seeing of surveillance, advertisement, and the military

Who makes Computer Vision algorithms? How are they being trained to see? What is made visible through these algorithmic ways of seeing, and what is otherwise ignored? The answer to these and many other questions points to the often-ignored sociotechnical complexity involved in the widespread adoption of CV. As they get implemented in smart cameras or automated cars, these algorithms carry within them not the capacity to ‘see,’ but that of making judgements over what in the visual world should be seen and how.

CV’s algorithmic power emerges from how, through their affordances and constraints, these algorithms shape ways of seeing. Matteo Pasquinelli and Vladan Joler compare algorithms to lenses: “Instruments of measurement and perception always come with inbuilt aberrations. In the same way that the lenses of microscopes and telescopes are never perfectly curvilinear and smooth, the logical lenses of machine learning embody faults and biases” (2). This is much similar to Amoore’s framing of algorithms as “aperture instruments,” thus suggesting that it is through “the processes of feature extraction, reduction, and condensation” that “algorithms generate what is of interest in the data environment” (16). These analogies to other instruments of perception are useful because they help understand contemporary CV as one possible lens, with many other possibilities. Algorithmic models are always imperfect and biased towards something – as Amoore puts it, they’re “always already partial accounts” (20).

The particular lens of hegemonic Computer Vision today is made of many ideological decisions over how the visual world should be understood and processed, including the ontology and epistemology that should be used in this process (see Azar et al.). What I call hegemonic CV is the dominant paradigm of automated ways of seeing, directly connected to surveillance, both military and commercial (e.g. automated military drones and biased proctoring systems). This paradigm is not directly stated or enforced, operating through consent and culture rather than force. Much as described by Gramsci, and later extended by Laclau and Mouffe, hegemonic ideological formations are produced and negotiated as the outcome of constant struggles for power, emerging from a wider cultural/social history. What’s crucial is that they get sedimented as a “common sense” (Gramsci),[2] an “accumulation of taken-for-granted ‘knowledge’” (Crehan 43). These collections of beliefs and ideas are “not a single unique conception, identical in time and space,” (Gramsci 343) but fragmentary and contradictory, “a product of history and a part of the historical process” (327). This sedimentation makes it increasingly difficult to imagine alternative lenses to see the world: as hegemonic CV further entrenches itself in our lives, our human ways of seeing and wider societal processes are changed (see Cox).

CV is just one of the many data capitalist/colonialist algorithmic operations through which value is extracted from appropriating people’s data, be it their face, their pictures, or other visual material (Couldry and Mejias). The main imperatives of ‘smart technologies’ such as hegemonic CV are extracting more data from all sources possible, while also “creating systems that monitor, manage, and manipulate the world and people” (Sadowski 9). In Sarah Myers West’s words, this operation is marked by a change in power relations:


“Access to data, and the ability to transform raw data into useful information, is asymmetrical, and power lies in the institutions with the technical and economic resources to render it intelligible” (37). Hegemonic CV is based on these structural conditions and built upon their limitations, thus operating in order to meaningfully limit who gets to see and who is seen – making itself into a crucial site of centralized and unequal surveillance and data extraction.

Hegemonic CV intentionally deflects attention from its unequal power structures and the many problems involved in its limited perception of the visual world. Hegemonic CV presents itself as objective, hiding its deep commitments to military and surveillant ways of seeing formed by Western, white, and capitalist frames (Silva; Silva et al.; Buolamwini and Gebru; Mirzoeff; Pereira and Moreschi). As scholars have demonstrated time and again, supporting these structural inequalities are, among other issues, exploitative labor practices (Tubaro et al.; Irani “The Cultural Work of Microwork”) and problematic data sets (Thylstrup; Harvey and LaPlace; Prabhu and Birhane; Crawford and Paglen). Furthermore, there are many intended and unintended, known and unknown, consequences of Computer Vision, which are mostly ignored in exchange for profit (Wilken).

Despite the many ways CV could have been formed, my argument is that a ‘common sense’ has emerged that frames CV in a particular way that’s not just, equitable, or fair. These algorithmic imaginaries perform a “discursive closure,” cutting off alternatives that seek to work in a different way (3). Understanding the existence of a hegemonic CV allows us to better think about possible oppositions, resistances, and alternatives to its outsized hegemony.

As described by Raymond Williams,

It can be persuasively argued that all or nearly all initiatives and contributions, even when they take on manifestly alternative or oppositional forms, are in practice tied to the hegemonic: that the dominant culture, so to say, at once produces and limits its own forms of counter-culture. (114)

The conceptualization of hegemony enables thinking of our practice as part of wider struggles for re-constituting these systems. The notion of refusing departs from understanding that counterhegemonic struggles are responses constructed in the interstices of hegemonic forms. That is, even though we may try to re-imagine CV, we’re still located in relation to this dominant system.

Why a Critical Technical Practice, and the case for refusing as a verb

Redmon’s refusal to continue working in Computer Vision uses his privileged position to throw a wrench in the gears of hegemonic CV, helping to both expose these harmful technologies and delay their development. The book “Breaking Things at Work” by the scholar Gavin Mueller presents a historical account of how workers, not unlike Redmon, have long resisted the expansion of automation, seeing these “new machines as weapons wielded against them in their struggles for a better life” (e-book). Mueller suggests how Luddism – multiple forms of collective resistance to uncontrolled technological development – can form a decelerationist political project to challenge the continuous development and deployment of technologies. Such


a Luddite approach sees technology as a site of struggle in which antagonism is necessary to challenge hegemonic goals and assumptions – subverting technological control to regain power for workers. Mueller’s analysis frames such resistance as part of a generative politics – not only breaking machines and sabotage, but also forms of struggle such as that over policy and legislation. It means not only to say no to the development of new technical systems, but also to actively envision how we can center other values and paradigms.

All through industry, activism, policy, and research there are increased calls that center rejection (saying “no”) as a strategy to combat the problems caused by algorithm development and deployment in society. Just to cite a few recent calls: the “Feminist Data Manifest-No” proposes 10 points of refusal for harmful data regimes; Seeta Peña Gangadharan discusses people’s unwillingness “to accept data-driven systems in the terms and conditions that government or private actors present to us” (3); Sarah Hamid, in a remarkable interview, argues for abolishing “carceral technologies,” and organizing “against the logics, relationships, and systems that scaffold their existence”; and Chelsea Barabas suggests tech designers should “turn down requests and opportunities to build technologies that are likely to produce harm.” All of these pleas are located in a wider global environment in which tech workers have been putting pressure on companies’ unethical developments, as shown in the case of Google workers’ refusal to build Project Maven and the Tech Won’t Build It activist group.

The goal of refusing is compelling, and these initiatives make visible how it is both a personal and a collective way of antagonizing hegemonic technological systems. There are certainly many ways such disposition can operate, and I’ll focus here on just one of them: the concept of “critical technical practice” (referred to here as CTP). In his seminal text conceptualizing the term, “Toward a Critical Technical Practice: Lessons Learned in Trying to Reform AI,” Phil Agre recognizes “computing has been constituted as a kind of imperialism; it aims to reinvent virtually every other site of practice in its own image” (131). Agre’s CTP emerges from his own personal story as a computer scientist who only after some time began to realize the political, social, and cultural constitution of technology, therefore suggesting how practice should avoid the separation between computer science and critical reflection. Practitioners would work interdisciplinarily, with “one foot planted in the craft work of design and the other foot planted in the reflexive work of critique,” in order to create alternatives that engage software as a “technical, cultural, and interpersonal object” (Harwood 32).

The problem I see in the conceptualization of CTP brought by Agre is that, as it was created over 20 years ago, it is somewhat detached from the current historical moment, refusing major revolutions and believing that change can be brought through individual practitioners and their “intrapersonal” critical consciousness. The notion of CTP has been further adapted through the past two decades, as scholars/practitioners have spun it into a framework for critically-engaged design (e.g. Phoebe Sengers, Matt Ratto, Garnet Hertz, Alan Blackwell, Shaowen and Jeffrey Bardzell) or artistic practice (e.g. YoHa and Matthew Fuller, Winnie Soon). CTP remains a powerful conceptualization for thinking of technological practice as a mode of critique – making computation a way of understanding and re-thinking computation itself.


Much as these developments have been important, I suggest CTP now needs to be further updated with contemporary notions of collective resistance, as well as more radical forms of antagonism and resistance. Agre himself acknowledged that critical research that worked from inside the system – including his own – could be quickly appropriated by the industry or the military. This further suggests how the goal of reform (or critiquing from the inside) can end up being innocuous, and reinforces the need for practices that defy institutionalization and that more strongly reject the ‘common sense’.

As hegemonic CV has become particularly entrenched, my argument is to use CTP to engage with radical contemporary proposals for refusing. Rather than simply building more stuff, or building better, this means centering considerations about what not to build, and of more strongly thinking about building as a way of creating alternatives. The concept of “refusing” allows us to abandon a solely reformist disposition, and instead supports a multifaceted, activist disposition through different approaches, including arts, activism, community organizing, and research. Refusing as a practice is not just for those who have coding or technical skills. Much to the contrary, it needs to engage with the stack of technological systems, from their labor practices to their philosophical underpinnings.

A practice that centers ‘refusing’ doesn’t necessarily mean not using AI or algorithms, but breaking from their hegemonic paradigms to imagine how they could be different if they could center marginalized perspectives. Refusing, importantly, is to go against the tech/AI hype (Vinsel) and instead show how and when these technologies may not work as supposed. Moreover, and paraphrasing Agre in Your Face Is Not a Bar Code, this means understanding that surveillant technologies will “work well enough to be dangerous, and poorly enough to be dangerous as well.”

Finally, ‘refusing’ as a lens for a critical technical practice is not a negative, but a generative proposition – therefore always a verb indicating a continuous struggle. It’s not just saying “no,” but understanding how technology is located within a wider array of historical, social, and cultural conditions. Drawing on the work of data feminism, it needs to “name and challenge sexism and other forces of oppression, as well as… seek to create more just, equitable, and livable futures” (D’Ignazio and Klein 6). As the artist and researcher Caroline Sinders proposes, our practice needs to be “productive, as well as provocative,” operating as “band aids”: these are “not meant to create an end to all other potential solutions,” but rather point to the gaps of inequity and violence created by society and “are poetic witnesses of those gaps.” Refusing needs to be an interdisciplinary practice that aims to affect the world, moving beyond the individual practitioner and attempting to disrupt hegemonic structures of power and make change. Refusing both breaks the system in operation and creates alternative systems.

Speculations on what refusing can mean across the ‘stack’ of hegemonic CV

I will now focus on the particular case of Computer Vision, to highlight potential directions refusing as a critical technical practice may take when engaging with such technologies. CV, as any other algorithmic technology, is tremendously complex, formed of many different interlocking social, cultural, economic, and legal aspects. Any attempt at considering such wide scale systems is necessarily partial, focusing only on some parts


of the whole. One attempt at looking at the ‘stack’ of CV has been suggested by Azar, Cox, and Impett, a “vertical cartography” comprised of six different levels (10). While their original goal was organizing critical analyses of machine vision, my aim is to use these layers to envision sites of refusing interventions by artists, researchers, and activists. For this, I both point to important existing projects and to spaces that still remain to be intervened on. I hope, with this, to offer more questions than answers and suggest critical pathways for a counterhegemonic practice:

(1) Social level (where are such systems deployed, by whom, for what purpose)

At this level, it’s possible to critique and reject systems that are being deployed, as well as imagine alternative technological formations. How can practitioners expose the nefarious impacts of CV, including the way these systems concentrate power and affect the most marginalized? For example, the Coveillance project (coveillance.org) aims to map surveillant technologies in the city space, creating workshops such as “A walking tour of surveillance infrastructure in Seattle,” in which the deployment of smart cameras is discussed by organizers and the public.

What technologies shouldn’t be used at all? And how can practitioners act on the creation of alternative institutions for this emergent regulation and policy of technology? A particularly powerful example of this is the Seattle Surveillance Ordinance, which has sought to create new systems for the regulation of technologies by involving the affected communities (Lee; Young et al). The Ordinance sought to involve citizens in approving or rejecting emergent technology uses, such as Automated License Plate Readers (ALPR). By creating the possibility of curtailing the operation of these systems, or at least exposing them, the Seattle Ordinance is a major example of how refusal can take place in a policy, activism, and community organizing sense.

Here also lies the discussion on the workers behind CV algorithms. What kinds of actions are possible to allow them to refuse creating certain technologies? Could CV developers refuse the use of their code by the military and big tech companies? The case of Redmon, discussed in the introduction, is just one of many attempts by tech workers to refuse their labor, a practice in which activists play a major role (Mueller). However, how can practitioners also offer other workers, who are often not given much power in the system, the possibility of antagonizing the development of CV? (see e.g. the work of xtine burrough with Amazon Mechanical Turkers).

(2) Computational level (which problems are being solved: e.g. ‘object detection’)

Refusing CV’s hegemonic ideology of surveillance and tracking, could other ‘problems’ be solved? Hegemonic CV focuses mostly on detecting and tracking through surveillance-oriented parameters. One possibility is the use of CV from a disobedient gaze, “surveilling the most powerful, as opposed to those marginalized” (Pereira 154; see also Barabas et al.). An example of this is VFRAME,


by Adam Harvey and Jules LaPlace, which seeks to create AI/CV tools for human rights uses, including a “Munition Detector” which could locate illegal munitions and support the work of activists.

What if, instead of the obsession for solving problems, CV focused on the errors of these systems, “both as a sign of their limitations and as a way to think otherwise” (Pereira 156)? This form of refusal suggests the embrace of error, indeterminacy, opacity, and situatedness instead of the solution of bias and partiality (Amoore). This means using algorithmic ways of seeing as ways for exploring alternative visibilities and to create new connections that couldn’t be otherwise (see Pereira and Moreschi for an example of using CV to look at artworks through the lens of error).

(3) Data level (who labels, which images are chosen, who takes the photographs)

How can we refuse problematic data sets and their troubled histories? Vinay Prabhu and Abeba Birhane even name computer vision a pyrrhic win due to the “problematic practices and consequences of large scale vision datasets” (1). Data sets often rely on the extraction of images without agreement from users (see Harvey & LaPlace; Thylstrup), use precarious labor of Amazon Mechanical Turkers (Irani, “Justice for ‘Data Janitors’”), as well as organize seeing through labels that encode racist, classist, and sexist histories (Hanna et al.; Crawford and Paglen; Smits and Wevers). How can designers instead create datasets in ways that are more just, operating other ways of collecting, curating, and organizing images? The project Feminist Data Set, by researcher and artist Caroline Sinders, asked this question, and over a period of many years has been holding workshops and forums to collaboratively investigate these questions in the case of a chat bot. The outcome of this project hasn’t been (and won’t be) an ultimate response, but different tools and examinations on what data sets could look like if data sets were thought from a feminist lens.

It’s important to also consider how image data sets are themselves partial, and what other images could/should be included in the creation of CV. What becomes possible if, for example, there wasn’t an expectation of bigness in data sets, with a practice focusing on small data? (see the project Prosthetic Memory for an example of a custom-made one-person machine learning tool). What if collecting images/data is refused, and instead CV is trained on computationally manufactured image data? Could that open a way of not even needing to collect people’s image data at all? (see e.g. Harvey’s VFRAME project).

(4) Algorithmic/representational level (e.g. Siamese convolutional neural network with Adam gradient descent optimization)

Though much discussion critical of hegemonic CV has tended to focus on the “data problem” (Hooker), the algorithmic models themselves also shape which ways of seeing are possible, what data are valued, and the particular modes


through which problems are solved. How could we refuse current modes of measuring and quantifying? Outside of hegemonic centers, Rodrigo Ochigame has written about the historical construction of alternative search algorithms in Cuba and Brazil, which evaded hegemonic notions of ‘relevance.’ Departing from examples like this, how could machine ways of seeing break from predictive models and the simple assignment of tags/descriptions?

This also relates to how computer code is created and the assumptions that go into its development. Much practice, for example within Software Studies (Soon and Cox), seeks precisely to queer code, breaking with the binary in algorithmic operations through the practice of coding. This practice based on software art goes beyond the purely technical to engage with the writing of code as potentially also poetic, critical, and material. How could the code of CV be made more visible, refusing its disappearance into technical infrastructures? An example of this is the work of the Brazilian artist Waldemar Cordeiro, who over 50 years ago experimented with ways images could be represented and analyzed by creating his own computer algorithms.

(5) Implementation/physical level (e.g. Nvidia GPU)

Where are the physical structures of CV located, and who owns them? This question leads to a political economy of infrastructures and to considering what infrastructures could instead be used for doing this work. How could CV be developed in ways that decentralize away from the power of Amazon’s AWS or the Microsoft Cloud? Could these alternative CV infrastructures allow individual practitioners to operate away from the control of tech companies? Likewise, could these computational systems be developed in ways that are based on nature rather than continuous extractivism? The Low Tech Magazine (solar.lowtechmagazine.com) points to alternative directions by hosting a static website entirely through solar energy. These low-tech perspectives could very much change how CV is practiced, potentially away from centralized large-scale image data.

(6) Philosophical/axiomatic level (e.g. vision as inverse graphics)

This is the hardest level to refuse because the philosophical underpinnings that contemporary CV stands on are particularly hegemonic. How could CV operate from completely different values and theories? Perhaps most importantly of all, how could CV refuse whiteness and colonialism, and its problematic categorizations and standardizations? As phrased by Rachel Adams, decolonial thought needs “to make intelligible, to critique, and to seek to undo the logics and politics of race and coloniality that continue to operate in technologies and imaginaries associated with AI in ways that exclude, delimit, and degrade other ways of knowing, living, and being that do not align with the hegemony of Western reason” (190). In this sense, Couldry and Mejias remind us that it is not enough to resist with individual tactics, but to engage


in collective resistance (much like the Luddites).

A goal for practice then needs to be the much wider change on how data and algorithms are rationalized, which can only happen through alternative institutions and networks of practice. Though not focused on CV, Sabelo Mhlambi has written on how Ubuntu philosophies could change AI’s paradigms. He suggests AI could operate away from its dominant culture of personhood based on rationality and individualism. The Ubuntu framework for personhood is based on relationality and different principles: “solidarity, reconciliation, equality, equity, and community” (15). What would it mean to, in fact, put these into practice as guiding notions in CV policy, for example?

Considering refusal as part of this wider stack serves to argue that counterhegemonic responses can happen across different areas, even in places that so far remain little acknowledged. It shows how refusal can have many different facets and intensities beyond the work of computer scientists like Redmon. It importantly highlights the radical work being done by activists, artists, community organizers, researchers, etc. While useful, this stack is just one of many possible cartographies of action. The Stop LAPD Spying activist group, for example, suggests a framework for mapping surveillance divided on different layers: “Ideological, Institutional, Operational, and Community.” Such a way of mapping illuminates how practice also needs to consider how communities should be involved in the creation and use of CV. Which communities should be centered that haven’t yet, and which are being marginalized by CV use? This essay only hopes to serve as a starting point, so I leave it to you, the reader, to consider what other ‘counter-cartographies’ of refusing are possible, and how they can support your practice.

Conclusion: Performing alternatives, bearing witness to limitations

When the computer scientist Joseph Redmon refused to continue the development of Computer Vision, due to his perception of the harms and injustices these technologies were causing, he threw a wrench in the system. Hegemonic Computer Vision advances these algorithms as natural developments, neutral and impartial operations, even though they’re mostly used for supporting multiple forms of surveillance. In this article, I’ve demonstrated how refusing is a powerful counterhegemonic stance to this ‘common sense,’ especially through personal and collective antagonism to uncontrolled technological development. Bridging this stance with critical technical practice’s focus on developing through the technical can serve to perform alternatives and pave the way to radical reimaginations – or at least create some ‘band-aids’ that bear witness to how much work we still need to do.

We needn’t only reform CV, but depart from an active refusal of its ideology and organizations, completely breaking it apart in pieces before building something new. Our goal is not simply to propose solutions, but to refuse the unsettling ‘common sense’ of technological progress through action. Much like the Luddites, refusing to work and sabotaging, but also building new ways of seeing and generating new collective engagements with the visual world.


Notes

[1] We also have to be somewhat critical of Redmon's refusal to build CV, particularly his late realization of what he was doing and his privileged position to refuse work while still being a white male US-American grad student (see the movie The Social Dilemma for an infuriating example of late realizations by privileged computer scientists).

[2] What I refer to as 'common sense' is originally referred to by Gramsci, in Italian, as senso comune. Although 'common sense' is the adopted English translation, it is important to make clear that the Italian term used by Gramsci does not have the connotation of a good and reasonable judgement, which is instead referred to as buon senso ('good sense'; see Crehan 43-58).

Works cited

Adams, Rachel. "Can Artificial Intelligence Be Decolonized?" Interdisciplinary Science Reviews, vol. 46, no. 1–2, Taylor & Francis, Apr. 2021, pp. 176–97, doi:10.1080/03080188.2020.1840225.

Agre, Phil. "Toward a Critical Technical Practice: Lessons Learned in Trying to Reform AI." Bridging the Great Divide: Social Science, Technical Systems, and Cooperative Work, edited by Geoffrey Bowker et al., Erlbaum Hillsdale, 1997, pp. 130–57.

---. Your Face Is Not a Bar Code. 2003, https://pages.gseis.ucla.edu/faculty/agre/bar-code.html.

Amoore, Louise. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press, 2020.

Azar, Mitra, Geoff Cox, and Leonardo Impett. "Introduction: Ways of Machine Seeing." AI & Society, Feb. 2021, doi:10.1007/s00146-020-01124-6.

Barabas, Chelsea. "To Build a Better Future, Designers Need to Start Saying 'No.'" Medium, 20 Oct. 2020, https://onezero.medium.com/refusal-a-beginning-that-starts-with-an-end-2b055bfc14be.

Barabas, Chelsea, et al. Studying Up: Reorienting the Study of Algorithmic Fairness around Issues of Power. 2020, pp. 167–76.

Buolamwini, Joy, and Timnit Gebru. "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification." 2018, pp. 77–91.

burrough, xtine. "A Decade of Working with the Working Crowd." Media-N, vol. 16, no. 1, Mar. 2020, pp. 116–40, doi:10.21900/j.median.v16i1.221.

Cifor, Marika, et al. Feminist Data Manifest-No, https://www.manifestno.com. Accessed 7 July 2021.

Couldry, Nick, and Ulises A. Mejias. The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism. Stanford University Press, 2019.

Cox, Geoff. "Ways of Machine Seeing: An Introduction." A Peer-Reviewed Journal About, vol. 6, no. 1, 2017.


Crawford, Kate, and Trevor Paglen. Excavating AI: The Politics of Training Sets for Machine Learning. 2019, https://excavating.ai/.

Crehan, Kate A. F. Gramsci's Common Sense: Inequality and Its Narratives. Duke University Press, 2016.

D'Ignazio, Catherine, and Lauren F. Klein. Data Feminism. MIT Press, 2020.

Gangadharan, Seeta Peña. Technologies of Control and Our Right of Refusal. TEDx London, 2019, http://eprints.lse.ac.uk/101157/4/Gangadharan_technologies_of_control_remarks.pdf.

Gramsci, Antonio. The Gramsci Reader: Selected Writings, 1916–1935. Edited by David Forgacs, New York University Press, 2000.

Hamid, Sarah T. "Community Defense: Sarah T. Hamid on Abolishing Carceral Technologies." Logic Magazine, 2020, https://logicmag.io/care/community-defense-sarah-t-hamid-on-abolishing-carceral-technologies/.

Hanna, Alex, et al. "Lines of Sight." Logic Magazine, https://logicmag.io/commons/lines-of-sight/. Accessed 26 Apr. 2021.

Harvey, Adam, and Jules LaPlace. MegaPixels: Origins and Endpoints of Biometric Datasets "In the Wild." 2020.

Harwood, Graham. "Teaching Critical Technical Practice." The Critical Makers Reader: (Un)Learning Technology, edited by Loes Bogers and Letizia Chiappini, Institute of Network Cultures, 2019.

Hooker, Sara. "Moving beyond 'Algorithmic Bias Is a Data Problem.'" Patterns, vol. 2, no. 4, Apr. 2021, p. 100241, doi:10.1016/j.patter.2021.100241.

Irani, Lilly. "Justice for 'Data Janitors.'" Public Books, 15 Jan. 2015, https://www.publicbooks.org/justice-for-data-janitors/.

---. "The Cultural Work of Microwork." New Media & Society, vol. 17, no. 5, 2015, pp. 720–39.

Laclau, Ernesto, and Chantal Mouffe. Hegemony and Socialist Strategy: Towards a Radical Democratic Politics. 2nd ed., Verso, 2001.

Lee, Jennifer. "Creating Community-Centered Tech Policy." Affecting Technologies, Machining Intelligences, edited by Dalida Maria […], https://book.affecting-technologies.org/creating-community-centered-tech-policy/.

Markham, Annette. "The Limits of the Imaginary: Challenges to Intervening in Future Speculations of Memory, Data, and Algorithms." New Media & Society, 2020, doi:10.1177/1461444820929322.

McCosker, Anthony, and Rowan Wilken. Automating Vision. Routledge, 2020.

Mhlambi, Sabelo. "From Rationality to Relationality: Ubuntu as an Ethical and Human Rights Framework for Artificial Intelligence Governance." Carr Center for Human Rights Policy, Harvard Kennedy School, 2020, p. 31.


… "Space and Racial Surveillance Capitalism." AI & Society, Nov. 2020, doi:10.1007/s00146-020-01095-8.

Mueller, Gavin. Breaking Things at Work: The Luddites Are Right About Why You Hate Your Job. Verso, 2021.

Noble, Safiya Umoja. Algorithms of Oppression. NYU Press, 2018.

Ochigame, Rodrigo. "Informatics of the Oppressed." Logic, vol. 11, 2020.

Pereira, Gabriel. "Alternative Ways of Algorithmic Seeing-Understanding." Affecting Technologies, Machining Intelligences, edited by Dalida Maria […], http://book.affecting-technologies.org/alternative-ways-of-algorithmic-seeing-understanding/.

Pereira, Gabriel, and Bruno Moreschi. "Ways of Seeing with Computer Vision: Artificial Intelligence and Institutional Critique." AI & Society, 2020, doi:10.1007/s00146-020-01059-y.

Prabhu, Vinay Uday, and Abeba Birhane. "Large Image Datasets: A Pyrrhic Win for Computer Vision." ArXiv, 2020.

Redmon, Joseph, and Ali Farhadi. YOLOv3: An Incremental Improvement. ArXiv, 2018, https://arxiv.org/abs/1804.02767.

Sadowski, Jathan. Too Smart: Digital Capitalism Is Extracting Data, Controlling Our Lives, and Taking Over the World. MIT Press, 2020.

Silva, Tarcízio, et al. APIs de Visão Computacional: investigando mediações algorítmicas a partir de estudo de bancos de imagens. no. 01, p. 30.

---. Visão Computacional e Vieses Racializados: Branquitude Como Padrão no Aprendizado de Máquina. 2019.

Sinders, Caroline. "In Defense of Useful Art." Pioneer Works, 2020, https://pioneerworks.org/broadcast/caroline-sinders-in-defense-of-useful-art.

Smits, Thomas, and Melvin Wevers. "The Agency of Computer Vision Models as Optical Instruments." Visual Communication, Mar. 2021, doi:10.1177/1470357221992097.

Soon, Winnie, and Geoff Cox. Aesthetic Programming: A Handbook of Software Studies. Open Humanities Press, 2021.

Stop LAPD Spying Coalition and Free Radicals. "The Algorithmic Ecology: An Abolitionist Tool for Organizing Against Algorithms." Medium, 3 Mar. 2020, https://stoplapdspying.medium.com/the-algorithmic-ecology-an-abolitionist-tool-for-organizing-against-algorithms-14fcbd0e64d0.

Thylstrup, Nanna Bonde. "Data out of Place: Toxic Traces and the Politics of Recycling." Big Data & Society, vol. 6, no. 2, 2019, doi:10.1177/2053951719875479.

Tubaro, Paola, et al. "The Trainer, the Verifier, the Imitator: Three Ways in Which Human Platform Workers Support Artificial Intelligence." Big Data & Society, vol. 7, no. 1, 2020, doi:10.1177/2053951720919776.


Vinsel, Lee. "You're Doing It Wrong: Notes on Criticism and Technology Hype." Medium, 1 Feb. 2021, https://sts-news.medium.com/youre-doing-it-wrong-notes-on-criticism-and-technology-hype-18b08b4307e5.

West, Sarah Myers. "Data Capitalism: Redefining the Logics of Surveillance and Privacy." Business & Society, vol. 58, no. 1, 2017, pp. 20–41, doi:10.1177/0007650317718185.

Williams, Raymond. Marxism and Literature. Oxford University Press, 1977.

Young, Meg, et al. "Municipal Surveillance Regulation and Algorithmic Accountability." Big Data & Society, vol. 6, no. 2, July 2019, doi:10.1177/2053951719868492.
