FILM 2601B: Summary Response

Ashlyn Murray – 101231372

Professor Gunnar Iversen

February 9, 2023

Word Count: 984

Summary

The chapter "Engineered Inequity: Are Robots Racist?", written by sociologist Ruha Benjamin, argues that the racial discrimination and prejudice of a historically hegemonic White society have been unintentionally embedded in, and reinforced through, the supposedly neutral and unbiased technologies designed and used today, particularly in the form of "racist robots".1 By using different forms of artificial intelligence (AI) and their deep learning systems as examples, such as Beauty AI and the social media app Facebook, Benjamin challenges the common assumption that technology is incapable of expressing bias and proves it to be undoubtedly false.2

Specifically, by presenting Beauty AI's shockingly biased contest results, generated by robots in favour of White participants, Benjamin draws attention to the broader socially influenced processes that are rooted in the "naturally occurring" datasets and algorithms controlling most of today's technology.3

1. Ruha Benjamin, "Engineered Inequity: Are Robots Racist?" in Race After Technology: Abolitionist Tools for the New Jim Code (Cambridge: Polity Press, 2019), 34-35.
2. Benjamin, "Engineered Inequity: Are Robots Racist?", 34.
3. Benjamin, "Engineered Inequity: Are Robots Racist?", 34-35.

Additionally, Benjamin uses the concepts of social dimensions and social credit systems, such as the one found in China, as sources of comparison.4 In doing so, she shows that the biased results determined by Beauty AI's algorithm and datasets, which claim to constitute what counts as beauty and health, accord only with the harmful views and goals of a racially discriminatory and hegemonic White society.5 Overall, the chapter brings recognition to the unintentional foundations and consequences of technology that are entrenched in a longstanding and racially motivated societal desire to enforce and strengthen social hierarchies and social capital in favour of the White population and, more particularly, in opposition to historically marginalized communities.6

Critical Evaluation

"Engineered Inequity: Are Robots Racist?" is an academic chapter found within sociologist Ruha Benjamin's book Race After Technology: Abolitionist Tools for the New Jim Code, and it contains many insightful claims dismissing the commonly held but false assumption that technology is neutral and unbiased.7 The chapter is carefully divided into sections with subtitles, each of which addresses certain aspects of the author's ideas and arguments. Further, throughout each of these sections, a variety of supporting secondary sources and examples have been rigorously selected and effectively used for the purposes of analysis and comparison.

4. Benjamin, "Engineered Inequity: Are Robots Racist?", 45.
5. Benjamin, "Engineered Inequity: Are Robots Racist?", 34-35.
6. Benjamin, "Engineered Inequity: Are Robots Racist?", 45-46.
7. Benjamin, "Engineered Inequity: Are Robots Racist?", 34.

More specifically, Benjamin begins the chapter by focusing on one valuable example in particular: the first-ever beauty contest judged by robots, held through the app Beauty AI.8 Using the shocking results generated by the app's robot judges in favour of White participants, Benjamin provides readers with a real-life example demonstrating the unexpected bias deeply embedded within AI and other technologies.9 In doing so, she creates a guide for the rest of the chapter to follow and build upon.

Prominent examples presented later in the chapter draw on an abundance of peer-reviewed secondary sources. These sources are critically analyzed to identify the root causes and effects of the technological biases demonstrated by Beauty AI's results. As a result, notable attention is given to the roles and intentions of those responsible for the creation of technology, both historically and presently: humans and the "naturally occurring" data they generate.10 Building on these concepts, broader socially influenced processes, such as those rooted in a desire for dehumanization and power, are identified and examined.11 Essentially, these sources introduce readers to the socially embedded discriminatory desires and consequences within technology that have resulted from the historical influence of a hegemonic White society.

Additionally, the definition of racism and the implications associated with it are analyzed within the chapter.

8. Benjamin, "Engineered Inequity: Are Robots Racist?", 33.
9. Benjamin, "Engineered Inequity: Are Robots Racist?", 34.
10. Benjamin, "Engineered Inequity: Are Robots Racist?", 39.
11. Benjamin, "Engineered Inequity: Are Robots Racist?", 36-39.

Most importantly, Benjamin addresses that, contrary to what many people assume, racism can exist without an intent to harm.12 While some racist actions can undoubtedly be attributed to individuals and their own discriminatory feelings towards certain groups, it must also be recognized that the racist actions of others can be attributed to the values of White hegemony entrenched in society rather than to individual feelings. More specifically, the chapter explains that the common link between racism and harm persists because racial discrimination and prejudice are continuously reinforced by what broader socially influenced processes have established as routine, reasonable, intuitive, and codified.13

Finally, comparisons are used to explore social dimensions and social credit in relation to the desires of White hegemony entrenched within technology. To do so, the chapter draws on multiple examples, including apps and platforms like Facebook, Google, Instagram, and Netflix, as well as an automated soap dispenser and larger-scale subjects like China's ranking system and ideas expressed by Donald Trump during his presidency.14 Using these examples, Benjamin distinguishes between the harms and benefits of collecting data produced by socially biased individuals, data that then directly guides the presumably unbiased technology present in almost all aspects of our everyday lives.15 Through this distinction, greater implications emerge surrounding the consequences that collecting data for these purposes has on shaping the behaviour of technology users and citizens in general, behaviour that accords with pre-existing hegemonic and socially hierarchical views and continuously oppresses marginalized groups.16
12. Benjamin, "Engineered Inequity: Are Robots Racist?", 40-41.
13. Benjamin, "Engineered Inequity: Are Robots Racist?", 41.
14. Benjamin, "Engineered Inequity: Are Robots Racist?", 45-47.
15. Benjamin, "Engineered Inequity: Are Robots Racist?", 45-46.


Taking all of this into consideration, I agree with Benjamin's claims and find them to be extremely useful. Because of the extensive research and analysis applied throughout the chapter, I was able to reflect on my own experiences with technology in relation to Benjamin's claims. In doing so, I came to recognize the broader socially influenced processes that are entrenched within everything I do and that, more particularly, direct all of the information I consume and how I consume it. Most importantly, however, these claims are useful in general because they open readers' eyes to the blatant biases encoded within the commonly assumed "neutral" technology that is used daily, and specifically to the ongoing racially discriminatory and harmful consequences that result from it.

16. Benjamin, "Engineered Inequity: Are Robots Racist?", 46-47.

Bibliography

Benjamin, Ruha. "Engineered Inequity: Are Robots Racist?" In Race After Technology: Abolitionist Tools for the New Jim Code, 33–52. Cambridge: Polity Press, 2019.
