
Embodiment and disembodiment in technology

The internet is not in the cloud. What sustains the virtual is very real. The most modern optical cables under the sea follow colonial shipping routes (fig. 0), and sharks can bite them apart. Considering the digital and the real not as divided but mutually interpenetrating and overlaid, this text looks at the politics and technologies that read and write bodies. Technologies, as engineered machines* or developed systems (e.g. cyberspace*), support as well as exploit bodies.


<machines/> When talking about her mother, who works at a hospital and was moved to a three-monitor workstation, Sondra Perry questioned the function of machines as tools that prolong and make-more-efficient the work of bodies: bodies being used and seen, being allowed to live longer to perform a function (Fred Moten in conversation with Sondra Perry, 2018). </machines>

<cyberspace/> ‘Cyberspace’s interfaces are perfectly hegemonic... They are enforced and informed by dominant ideologies... infrastructure and design limitations’ (Nakamura, 2002, pp.137) </cyberspace>

I looked for examples where technology writes or fails to write, reads or fails to read non-white non-cis bodies, and strategies where the fantastic failures* [queer bodies fail fantastically, glitching the tech (Russell, 2020, pp.9)] are used to dismantle=misuse(repurpose)+recycle technology. There are four case studies:

1) Disembodied AI assistants like Alexa hide labouring bodies.

2) Cameron James-Wilson, a white man, created and profits from the black digital

model, Shudu.

3) CV Dazzle, as a strategy, enables bodies to be (more) illegible to the machine.

4) Black Trans Archive by Danielle Brathwaite-Shirley centres Black Trans experiences

rather than recognisable bodies through video-game world/body-building.

This essay starts from the following:

● How does technology (dis)embody?

● Representation for whom and by whom?

● Instead of inclusivity, Illegibility!

● Centre experiences rather than bodies. (We cannot be represented!)

The conventional association of the physical body with the real self is questionable.

Bodies=Avatars* are equally (un)real.


<avatar/>The original Sanskrit meaning of the word avatar is ‘descent’ (Britannica, 2019),

indicating downloaded deities in human or animal forms in Hinduism. In everyday use today,

an avatar can be a virtual character or an uploaded profile icon that the users embody in

digital environments. </avatar>

In both definitions an avatar is the skin* people or deities slip under to act or speak.
<skin/>‘Skin can appear as a boundary that contains and separates the self from the world, it

is, in its tactility, a reminder of human inseparability from the world(s)’ (Fondation Brocher,

n.d.). </skin>

In this sense, the physical body*[aka. the digirati’s wetware] is as much an avatar as,

say, the gothic vampire character I created for myself in Second Life.1 These avatars, responsible for their actions and speech across worlds&platforms, are equally authentic.

1 Second Life is an application that allows people to create an avatar for themselves and have a second life in an online virtual world, or metaverse.

The process of embodiment is not to be mistaken for having a set of identifiable bodies. It emphasises instead the process of experiencing through the bodies and the passing on of experiences enabled by operating multiple bodies. Embodiment is a messy process. Acquired or arrested by the avatars and profile pictures are residues, new attributes, and blisters of wearing new skins. The many worlds&platforms collapse in the recurring habits, realised dreams*, and haunting nightmares* of the bodies.


<realised dreams/>Projekt Melody, a 3D anime-styled live streamer, helps many to find their

fictosexuality2 through cyber-sex. </realised dreams>

<haunting nightmares/>Players who deploy black characters have racial slurs shouted at them in the video game Red Dead Redemption 2 (Hernandez, 2019). </haunting nightmares>

‘There is no return to the concept of “the real”, as digital practice and the visual

culture that has sprung from it has forever reshaped how we read, perceive,

process all that takes place AFK*’ (Russell, 2020, pp.45).


<AFK/>AFK is an abbreviation for ‘away from keyboard’. It is an alternative way to indicate the state of being (temporarily) offline that does not conform to the narrative of the digital/physical divide.</AFK>

Acknowledging the messy field of embodiments across worlds&platforms, AFK

avatars/bodies should be singled out as they labour to make possible the other

worlds. As the first points of exploitation, conditioned by racial, gender and class

divides, AFK avatars/bodies are spent, rendered invisible, and denied access/time/resources/energy to enter the other worlds. We will begin exploring the exploitation of embodiment in the case of Alexa and Shudu Gram.

2 ‘an umbrella term for anyone who experiences exclusive sexual attraction toward fictional characters, a general type of fictional characters, or whose sexuality is influenced by fictional characters’ (lgbta.wikia.org/).


Alexa: Disembodied design and labour


Technological devices often come with synthetic voices that are disembodied.

Although feminine voices accompany accessible tech such as public transport as

well as smart tech such as home sound assistants, we do not often hear the stories

about the femme3 voice actors behind Alexa or Siri. Are their synthetic voices really of the machines? Although it is the text-to-speech algorithms that put sentences together for the sound assistants, a real person must record their voice to provide the machine with material to work with. Apple never acknowledged Susan Bennett as Siri’s voice (Ravitz, 2013), nor did Amazon acknowledge Nina Rolle as Alexa’s (Vincent, 2021). As for Jon Briggs, the male voice of Siri in the United Kingdom, he was warned by Apple not to speak publicly about Siri. The result is that the disembodied feel of tech is preserved.

3 The term femme is used here to describe the voice as feminine-identifying without assigning a gender to the voice.
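The point that the "synthetic" voice is really a person's voice, re-sequenced, can be sketched in miniature. This is a toy illustration only: the clip paths are hypothetical, and real systems like Siri's early concatenative engine worked on much finer phonetic units than whole words.

```python
# A toy sketch of concatenative synthesis: the assistant's sentences are
# stitched together from units a real voice actor recorded in a studio.
# The clip paths are hypothetical; real engines use phonetic units.

recorded_clips = {
    "play": "studio_session/play.wav",
    "some": "studio_session/some.wav",
    "music": "studio_session/music.wav",
}

def assemble(sentence):
    """Sequence the actor's recorded units; the 'machine voice' is theirs."""
    return [recorded_clips[word] for word in sentence.lower().split()]

print(assemble("Play some music"))
```

However the units are recombined, nothing in the output voice exists without the body that recorded the material.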
Under the CNN YouTube video clip ‘Meet the real woman behind the voice of Siri’, posted in 2013, one of the comments reads, ‘Well now I feel bad for all the times I told her to shut up 😭’. Taking mechanised femme voices as disembodied not only reinforces the gender bias of females in assistant roles (Manton and Campbell, 2021), invisibly blending into the background yet always available if you need her,4 but also perpetuates abuses. Behind the entertaining advancement of disembodied but gendered tech sits an industry of mostly male-identifying programmers, technologists and designers. The misogynist idealisation of machines as femme but not yet human* allows Siri to respond to ‘you are a slut’ with ‘I’d blush if I could’ (Fessler, 2017).


<machines as femme but not yet human/>

Ex.1 ‘Vehicles, including ships, cars, trains and even engines often take the feminine gender,

especially in informal contexts and when spoken of by men’ (Antidote, 2017).

Ex.2 In cyberpunk novels and films, there exist tropes of femme Asian bodies being overtly sexualised and fetishised as cyborgs, whose main purpose is to care for the emotional needs of male characters while not showing much emotion themselves (Diversify Our Narrative Campaign, 2021).

</machines as femme but not yet human>

4 Alexa’s far-field voice recognition allows her/it/them to hear you from everywhere.

But sound assistants are not always advertised as being without a body. In

Amazon’s Big Game Commercial: Alexa’s Body, published in February 2021, Alexa

was given the body of Michael B. Jordan. In the ad, the female user fantasised about

many romantic scenarios with Michael B. Jordan/Alexa, including sitting in a bubble

bath and being read an audiobook (fig. 2). The video description says ‘we've found a

new body for Alexa. Who knew Alexa had abs?’ This statement raises many

questions. What was the old body of Alexa if this one is new? Was it the suppressed

association between Alexa and Nina Rolle? Or was it the avoided narrative of Alexa

as one of the many femme bodies who labour in homes? After all, it is ‘not cool’ to

imagine an all-too-familiar Asian nanny or femme secretary cheerfully walking

around the house, switching on and off the lights and telling jokes. But is it

acceptable and even desirable if the embodiment is of a famous Black man like

Michael B. Jordan?

Why did Amazon give Alexa Michael B. Jordan’s abs, if not to further the

mystification of technology as being able to provide users with more (exciting)

bodies one can use (if not abuse) without consequences? In deliberately associating

or dissociating bodies with machines, we are left with the historical tendency of

enslaving* certain bodies in the care-free process of mechanisation.


<enslave/>The word robot derives from robotnik (Czech), meaning forced worker, rabota (Old Church Slavonic), meaning servitude, and rabu, meaning slave.</enslave>

This concerns not only dehumanising femme bodies but also commercialising

blackness, which will be further explored in the next chapter. The question of

where exactly the abs of Alexa are should be answered in terms of which laboured

bodies make tech possible, smart and affordable.

Sociologist Ruha Benjamin has remarked, ‘at the same time that Amazon ran the Alexa ad, the company was trying to crush the unionisation of mostly Black and Latino workers in its Alabama warehouse.’ ‘Fantasizing bubble bathing with a sexy black Alexa’ is far from these workers’ experience of embodying technology* (Benjamin, 2021).
<embodying technology/>The store and warehouse workers’ bodies are under constant

technological surveillance. Amazon uses navigation software, item scanners, wristbands,

thermal cameras, security cameras and recorded footage to boost the workers’

performances. (Hanley, 2021) </embodying technology>

Kate Crawford and Vladan Joler’s Anatomy of an AI System takes an even closer look

at how embodying Amazon Echo is experienced in the expenditure of bodies. It is

the users, unpaid or low paid students, volunteers, interns, crowdworkers and

outsourced services in developing countries who trained the datasets for Alexa to

be smart. In upholding a disembodied if not wrongfully embodied tech, spent are

the bodies of the assemblers, component manufacturers, smelters, refiners and

miners who work for low pay, work unscheduled overtime, work illegal hours, work with toxic materials and neurotoxins (such as arsine, phosphine and other substances that expose workers to health hazards such as cancer, miscarriages and birth defects), work under low-frequency electromagnetic and radiofrequency radiation, low-level radiation, and airborne metal, work through hard labour, and work as forced or child labour (Crawford and Joler, 2018).

Shudu Gram: ‘embodied’ representation without experience


Being able to pass as something other than one’s meatspace5 self leads to the

wrong belief that one’s online being is separated from one’s meatspace self, as if the

differences and injustices of the latter could just be left behind (Wark, 2020). Those

whose AFK bodies laboured to make technology possible, who are rendered

invisible in disembodiment, tend also to have less of an online presence. Studies

show that young people who are African American, Latinx, low income, participate

in English as Second Language programs, and/or lack adequate housing are less

likely to have access to the internet (Schaffhauser, 2020) (US); 17 out of 30 Black

and Latino individuals who are HIV positive and at risk for cardiovascular events

have no internet, no computer or lack knowledge of how to use the internet or a

computer (Adkins-Jackson, Brown and Loeb, 2021) (US); ‘only 51% of households

earning between £6000-10,000 annually have home internet access compared with 99% of households with an income of over £40,001’ (Holmes and Burgess, 2020) (UK).

5 Meatspace is commonly known as the physical world, as opposed to cyberspace or a virtual environment.

The necessary devices (laptops with a sufficient graphics card and memory

space £419 - £1899, drawing pads £35 - £724, etc), software licenses (Photoshop

around £238.42/month, Autodesk Maya £306/year, etc), game subscriptions, and

free time and energy to learn how to use the relevant tools, make it even harder for

members from underprivileged communities to operate many avatars in many

worlds. ‘In spite of the claims that everyone is the same in virtual worlds, access to

technology and necessary skills will effectively replicate class divisions of the rest of

reality in the virtual spaces’ and ‘will tend to reinforce existing inequalities, and

propagate already-dominant ideologies’ (Nakamura, 2002, pp. 46). It is in this

context we talk about avatars and representations. One example that shows how

the difference in access can result in superficial embodiments (privileged white

people manipulating Black or ‘of colour’ bodies into trendy fashion things) is

Shudu Gram.

Shudu (fig. 3) claims to be one of the first digital supermodels. She has 218k

followers on Instagram. She is femme, has a dark skin tone and often wears short

afro hair. The creator of Shudu is Cameron James-Wilson, a white-presenting British

fashion photographer and visual artist, who claims to be inspired by South African

Princess Barbie dolls (Netizens, 2019). Despite many people finding it disturbing to

see a white man manipulating a black-presenting femme body (fig. 4, 5), on the

official website of Diigitals6, race was not mentioned at all in terms of Shudu’s

relation to her creator. Instead, she is framed as James-Wilson’s ‘self-expression’,

‘incorporating many aspects of her creator's interests’ (The Diigitals, n.d.).

6
The digital model company founded by Cameron James-Wilson which owns Shudu alongside many
other digital models.

11
White men creating black, brown, and of-colour virtual avatars, as ‘self-expressions’

is not unfamiliar in cyberspace. Lisa Nakamura, in her 2002 book Cybertypes: Race,

Ethnicity, and Identity on the Internet, looked at how the acquisition of an avatar,

often conceived as liberatory in challenging the racial and gender divide, can result

in identity tourism* that ‘perpetuates old mythologies about racial differences’

(Nakamura, 2002, pp.xv).


<identity tourism/>Identity tourism, or ‘online recreational passing’, is when users perform in

avatars or characters ‘differently raced from the user’ in stereotypical ways. ‘Tourists operate

from a position of privilege and entitlement; to be a tourist is to possess mobility, access,

and the capital to satisfy curiosities about “native” life’ (Nakamura, 2002, pp.xv) </identity tourism>

The overpopulated Geisha and Samurai avatars, mostly performed by white

players, are examples of exoticising the bodies of the ‘other’. This is demonstrated

in the case of the Diigitals when Galaxia, a turquoise-skinned alien avatar, is

presented as part of the ‘portfolio of diverse digital identities’ (fig. 6) alongside

Shudu (dark-skin femme), Dagny (Nordic femme), Brenn (dark-skin femme), Koffi

(?!7) (dark-skin male-presenting), J-YUNG 준영 (East-Asian male-presenting) and

Boyce (brown-skin male-presenting). What is shared between the dark/brown/of

colour bodies and the alien that James-Wilson is so eager to ‘represent’ if not a

sense of otherness? As Celeste Hay observed, ‘while Shudu alone is (a) poor and

fictitious representation of Black humans online, her association with Galaxia

pushes her out of this world’ (Hay, 2021).

The presentation of race or gender, or the diversity per se, of James-Wilson’s

models, comes without experience. He ‘picked parts of these women that he liked

and created this unreal figure that doesn’t have to go through the struggle,

rejection or abuse within (the) industry that many of these hard-working Black

women have to in their every day’ (J, 2020). Maybe Shudu does get hate comments

on Instagram, but does it hurt James-Wilson as much? Behind Shudu taking up

space as a strong representation of Black femme bodies, is James-Wilson executing

the powerful ‘white gaze (that) allows Black people to be reduced to flimsy shells of

blackness… to the confines of the Black body’ and his ‘usurpation of Black bodies as

images for white capital gain’ (Hay, 2021).

Countering the criticisms, the Diigitals claims to have ‘collaborate(d) with creators

from emerging economies and under-represented communities’. Indeed, there are

many black and brown bodies behind Shudu, such as Misty Bailey and Alek Deng
7 Coffee is known as a crop of European colonialism (ucsc.edu).
Malek who modelled for Shudu, who were also, later on, photoshopped out of the

final images8. The Diigitals website calls them muses, despite the fact that they are

treated as tailorable supplementary materials to help fulfil the established fantasies of

a white man. Misty said ‘(Shudu) opens up opportunities for black models in the

fashion/beauty industry that look similar to her’ (Bailey, n.d.), spilling the truth

that even when bodies with experiences of race are included in the project, it is

James-Wilson who gatekeeps which or what kind of Black body gets to be

represented.

Only temporarily, when Ama Badu, Shudu’s ghostwriter, spoke as Shudu in an

interview, did Shudu recognise her lack of the capability to experience her body.

Shudu/Badu wrote,

‘We can make the life we want to live without actually living it… I wish

everyone could live the reality they create for themselves on social media.

Until that happens in a genuine way, I think it’s important to remember that

not everything we see online is real’ (Shudu (Ama Badu), 2019).

8 Many companies do not have 3D models of their products for Shudu to wear; this is when real models’ bodies are required.
CV Dazzle: making excluded bodies more illegible

Race is not scientifically meaningful in biology: the sequencing of the human genome shows that racial groups are not genetically discrete (Smedley and Smedley, 2005). And it is in racism, the stereotypical/exoticised representation of how the racialised bodies are and the way the bodies are thus treated, that race is

experienced. A pixelated Obama image being AI-enhanced into a white-passing

male (fig. 7) is only one example of how systemic issues are hard-wired in tech.

When tech is ‘designed by white men and tested on white men, that it works best

on white men is therefore hardly a surprise’ (The Economist, 2021). The machines

are biased. The facial identification system is 34.4% more likely to misidentify or fail

to identify darker skin tone females than lighter skin males (Buolamwini, 2018).

They expect to see lighter men.
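The kind of audit behind these figures can be sketched as a simple per-group error-rate comparison. The outcome counts below are invented for illustration, chosen only to reproduce a 34.4-point gap; they are not Buolamwini's benchmark data.

```python
# Toy audit in the spirit of Gender Shades: compare a face-identification
# system's error rate across demographic subgroups. The numbers here are
# invented for illustration, not real benchmark results.

def error_rate(outcomes):
    """Fraction of test faces misidentified or not identified (False = error)."""
    return outcomes.count(False) / len(outcomes)

results = {
    "lighter-skin male":  [True] * 994 + [False] * 6,    # 0.6% error
    "darker-skin female": [True] * 650 + [False] * 350,  # 35.0% error
}

gap = error_rate(results["darker-skin female"]) - error_rate(results["lighter-skin male"])
print(f"error-rate gap: {gap:.1%}")  # → error-rate gap: 34.4%
```

The audit itself is trivial; what matters is who gets counted in the test set, and disaggregating by subgroup rather than reporting one flattering average.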

Such a coded gaze* has real-life impacts.


<coded gaze/> A term coined by Joy Buolamwini, referring to the biased facial identification

system that is applied in more complicated tools such as facial recognition and predictive

analytics.</coded gaze>

Facial recognition determines who gets hauled away by the police. In 2018, the

Metropolitan Police’s trialled facial recognition system wrongly matched a

14-year-old Black schoolboy to an individual on the watchlist. The boy ‘was held by

four plainclothes police officers, questioned, searched and fingerprinted to check

his identity’ (McLean, 2020). As for predictive analytics, which also uses facial

identification technology, it ‘disproportionately identifies Black people as high risks

and prevents them from buying homes, getting loans, or finding jobs’ (Awere, 2020).

Even when the big tech companies make progress in addressing these issues, they

do so only after being called out. Companies such as IBM, Face++ and Microsoft made

progress closing their facial identification gender/skin-type gap after being targeted

by Gender Shades9. But the companies not previously targeted, such as Amazon,

still perform badly identifying darker skin tone females (Buolamwini, 2019). The

considerations to alleviate harm to Black and brown bodies are not embedded in

the design of technology; inclusivity only comes as an afterthought (Benjamin, 2021), which is always too late.

9 ‘Gender Shades is a preliminary excavation of the inadvertent negligence that will cripple the age of automation and further exacerbate inequality if left to fester’ (Buolamwini, 2018).

There are strategies other than passively waiting/tiringly asking to be included.

Computer Vision Dazzle (fig. 8) is an example of finding power in embracing the

dispersed identity of being (even more) illegible. In the technologically mapped AFK

world, a body being visible and readable means that it is trackable and

traceable. CV Dazzle is a concept, instead of a pattern or product, that provides tips

on camouflaging from face-detection technology with fashion ‘looks’. It serves as

guidelines and inspiration for people to explore designs relative to specific

algorithms and different faces. Even though CV Dazzle was introduced in 2010, ten

years later in 2020, it became one of the many ways people shared to help protect

BLM10 protestors (Valenti, 2020).

Influencer Marty @martymoment designed several anti-surveillance makeups

based on CV Dazzle in June 2020. They tested out the CV Dazzle strategies and

updated the anti-surveillance knowledge using accessible facial ID tools such as the

10
Black Lives Matter.

17
ones built into iPhones or Instagram filters. In trying out the existing advice, Marty

figured out that the algorithm is a lot smarter now. Some tips, such as ‘partially

obscuring one of the ocular regions’ or ‘obscuring the nose-bridge area’ are

outdated. The systems recognised Marty even when they were wearing makeup

and an eyepatch. Wearing masks that cover the nose/mouth did not work either.

They updated the CV Dazzle information by letting people know ‘(the algorithms)

need only to see a fraction of one of these facial key points: eyes, nose and the

mouth to identify where other parts of the face may be’ (@martymoment, 2020).
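Marty's finding can be caricatured in a few lines: a detector that needs only some minimum fraction of the key points to remain visible. This is a toy model of my own making, not any vendor's actual algorithm; real detectors infer the face geometrically from whichever landmarks survive the occlusion.

```python
# A toy model (my own, not any real vendor's detector) of the claim that
# algorithms need only a fraction of the facial key points to find a face.

KEY_POINTS = {"left_eye", "right_eye", "nose_bridge", "nose", "mouth"}

def face_detected(visible, minimum=1):
    """Report a face if at least `minimum` key points survive the occlusion."""
    return len(KEY_POINTS & set(visible)) >= minimum

# An eyepatch plus a mask still leaves one eye visible: detected.
print(face_detected({"right_eye"}))  # True
# Only covering every key point defeats this toy detector.
print(face_detected(set()))          # False
```

On this logic, a look must break up or conceal all the key regions at once, which is why the older single-region tips no longer work.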

Marty also added practical advice tailored to the specific events of BLM protests.

They noted in their post that people should avoid using oil-based makeup, which

would bond with tear gas (@martymoment, 2020). They suggested that for

protestors going out under the sun, using jewels in make-up effectively helps stop the

algorithm from seeing faces. Two of their jewelled looks (fig. 9), one with big

colourful gems covering their face like colourful splashes, another with a black

milky way of grayscale diamonds running across their face, gave examples of

fabulous ways of protecting bodies from being read, misread and abused.

‘The whole concept of visibility assumes that you are not in a system that wants you

dead’ (Perry, 2018). To reject the readability that accords to standard social and

cultural coding (e.g. to be white, to be cisgender, to be straight) extends safety to

the othered bodies and renders them un-surveilled (Russell, 2020, pp.85). Both CV

Dazzle and Marty acknowledged the coded gaze and made it work for them. One of

Marty’s key points, ‘If you’re black or POC, you will have an easier time tricking

technology because algorithms have racial bias’ (@martymoment, 2020),

corresponds to CV Dazzle’s claim that the strategy ‘probably works better’ for

darker skin-tone people because ‘facial recognition systems were trained with

biased data sets that do not include enough data to learn separable visual

characteristics of underrepresented faces’ (Harvey, 2010).

Refusing to be readable is the alternative to being recognised under certain names.

To be identifiable and representable requires that bodies be assigned a singular

fixity in the existing categories/options offered, which is a ‘gorgeous proposition

that often ends tragically’ because it reduces the ways our bodies can be read

(Russell, 2020, pp.73). Though Facebook11 offered 54 genders in 2013, the user still

needs to choose one. Marty and CV Dazzle found power elsewhere than being

categorised ‘correctly’. They take up a kind of power commonly assigned to

whiteness, which is being undefinable and thus invisible. In the ‘spillage’ of bodies lies

the statement ‘we cannot be represented’ (Harney and Moten, 2013, pp.10).

Being illegible, bodies are ready to experience blurriness*

11 Now Facebook has 14 genders to choose from plus a custom free-form field.
<blurriness/>Who we are bleeds into one another.</blurriness>

, hybridity
<hybridity/>Nakamura discussed, in the book Cybertypes, the difficulty, when navigating website portals, of assuming one’s ethnicity, being Asian, and one’s nationality, being American, at the same time. Hybrid identity, or being many at the same time, is made impossible by the self-identifying box-ticking commonly required when filling in a user profile or Equality and Diversity monitoring form.</hybridity>

, and transiness.
<transiness/>Transiness means we are on the move! More to be explored in the next

section.< /transiness>

Danielle Brathwaite-Shirley: centring Black Trans experiences

‘The right to define what a body is, in addition to who can control these things

called bodies, has never been meted out equally’ (Russell, 2020, pp.35). The facial

recognition technology that fails to recognise darker skin-tone females has a 100%

failure rate identifying genderqueers, because bodies beyond the binary are not an

option written into the algorithm. In the coded gaze, it is whiteness/cisness that

defines what/which body is read as human. Black British trans artist and game

developer Danielle Brathwaite-Shirley says ‘a trans body is often a body that you’re

not expecting to see’. Acknowledging this, she/they in her/their work finds pride in

being ‘other’ and appreciates the power of not passing*


<passing/> ‘In the context of gender, passing or blending is when someone, typically a

transgender person, is perceived as cisgender instead of the sex they were assigned at birth.

The person may, for example, be a transgender man who is perceived as a cisgender man.’

(Wikipedia)</passing>

— Brathwaite-Shirley writes, ‘I identify as a Black Trans Demon. My pronouns are

they. I am definitely not a human’ (Brathwaite-Shirley, 2017).

Collaborating with eighteen Black Trans people from the community, Brathwaite-Shirley created the video game Black Trans Archive. In the game, the characters, mostly trans, are not straightforwardly recognisable as raced humans. Brathwaite-Shirley refers to the work as a mash*


<mash/>noun. a soft mass made by crushing a substance into a pulp, sometimes with the

addition of liquid.</mash>

where she/they created monster-like Black Trans Ancestors, such as the ones the player can choose to resurrect, which appear to each have three heads, hay-like texture for the skin, and fabric texture indicating general facial features (fig. 12). These features claim the beauty and normalcy of Black Trans bodies by overwriting the rules that conventionally define what a worldly body is (as opposed to bodies that are ‘otherworldly’).

Brathwaite-Shirley manipulated images of Black and Black trans people’s hands,

feet, and hair into grassy or bio textures that wrap the game as skins. Through a

process of digital parthenogenesis*, the cutting up, collaging, reworking and

remapping of existing Black Trans imageries reproduced Black Trans bodies in

encryptions.
<parthenogenesis/> Parthenogenesis is a form of asexual reproduction in which growth and
development of embryos occur without fertilization by sperm; It is the development of an embryo
from an unfertilized egg cell in animals, and apomixis (asexual reproduction through seeds) in
plants. (Wikipedia) </parthenogenesis>

The encrypted bodies, illegible, work against the exploitative hypervisibility* where

readable bodies are reduced to symbols of trauma: ‘the horrible pornography of

death (Hook, 2013)’.


<hypervisibility/> Being hypervisible is the result of an individual being recognized for their

'otherness' or deviance from the norm (Settles, Buchanan and Dotson, 2019). Hypervisibility

is understood as opposed to the invisibility of whiteness or heterosexuality.</hypervisibility>

The archive rallies against Black Trans tourism* not only in its character-building

but also in its rules and/or instructions.


<Black Trans tourism/>the tourists expect to consume images of Black Trans bodies in

pain</Black Trans tourism>

The archive underlines the player’s identity/positionality at the very beginning of

the game, where one has to declare whether they identify as Black Trans, trans, or

cis. The identification determines the player’s access to and experience of the

archive. To Brathwaite-Shirley, a white cis person playing the game as if they could

step into a Black Trans person’s shoes is wrong. The archive should not and does

not trust cis people. The non-Black or non-Trans tourists must work to stay in the

space. As the terms and conditions that popped up after I clicked on ‘I identify as

trans’ say, ‘YOU MUST AGREE TO SUPPORT BLACK TRANS PEOPLE TO REAP THE

REWARDS OF BEING IN THEIR PRESENCE’. To breach this agreement means being

expelled from the game. If I refuse to use my visitor privilege to resurrect Black

Trans ancestors in the game space of Black Excellence, the game will end with the

statement: ‘if you cannot support us you cannot be around us’.
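The archive's entry rules, as described above, can be paraphrased as a small branching sketch. The function name and the returned strings are my own paraphrase of the game's behaviour, not Brathwaite-Shirley's actual code.

```python
# A paraphrase, in code, of Black Trans Archive's entry rules as described
# above. Names and strings are my own, not taken from the game itself.

def enter_archive(identity, supports_black_trans_people=True):
    """Declare an identity at the door; access and obligations follow from it."""
    if identity == "black trans":
        return "centred: the archive is built for you"
    if identity in ("trans", "cis"):
        if not supports_black_trans_people:
            return "expelled: if you cannot support us you cannot be around us"
        return "visitor: you must work to stay in this space"
    raise ValueError("the archive asks every player to declare")

print(enter_archive("cis", supports_black_trans_people=False))
```

The gate inverts the usual order of things: rather than the body being read by the system, the player must declare themselves, and access follows from positionality rather than from recognition.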

Fiction/fantasy is a powerful tool in remapping the AFK world that we assume to be

reality*.

<reality/>‘No way where we are is here (Harney and Moten, 2013, pp.94)’. </reality>

Black Trans Archive, as a game environment, shapes at least players’ perceptions

and expectations of the AFK world it mirrors. In fantasizing, proposing alternative

ways to be, Black Trans Archive rejects pessimism, the tool of white (cis) supremacy,

which doesn’t want you to imagine otherwise (@noname, 2020). In the game there

are various locations. Cis city, as one of the locations, ‘uses surveillance to try and

control our bodies, search our bodies for clues’. Cis city eventually gets dissolved in

the game, and ‘every Black Trans person who lived there gets re-housed in more

positive environments.’ ‘Ur body ur choice. inc’, another location in the game, offers

unlimited hormones that people desire. At this place, one gets free hormone

treatments without filling out forms or waiting in lines. The archive also imagines

kindness to be expressed in the interaction between the player and the game

characters. The player can become part of the security team to accompany Black

Trans sisters walking along a path of harmful gaze, or lift weights off of Black Trans

people by whispering nice things to them. Building on acts of care, the imagination

holds overlapped, overlaid and oversaturated bodies of all kinds.

It emphasises ‘we are here’ knowing that it is ‘because of those that are not’ (fig. 14).

Always in diasporic travels, from bodies to bodies, from Cis City to Trans Temple,

the resurrected ancestors ask the player, ‘you have put me in this virtual body… Do

I have more control over this body than my body before?’ (fig. 13). As such, the

characters that journey across times and worlds confront the player with the

question of whether being uploaded/downloaded is empowering. The answer does

not have to be yes, but the questions and requests are points of entry for players to

take up agency and responsibility in a world-building in which their actions and

reflections affect Black Trans life before, here, and after. What is fictively imagined

and participated in bleeds out into spaces other than the game hosted at

https://blacktransarchive.com/. When we morph into the other avatars we occupy, we

still dream of the work done/to-be-done, kindness practised/to-be-practised, in the

Archive. They remind us there is a lot of work that needs to be done because ‘no

way where we are is here… Even though we already are. We’re already here,

moving’ (Harney and Moten, 2013, pp. 19).

Conclusion/It’s getting there...

The first section of this text considers the material basis of technology and the

labouring bodies spent in it while disembodied in its representations: the invisible labour behind Alexa.

Access to technology determines who gets to acquire more than one body. The

possibility to build any body, to journey between bodies, is power, and people may

abuse it. Section two showed, through the case of Cameron James-Wilson and

Shudu, that it costs a white creator neither experience nor struggle to create

bodies that ostensibly represent Black people online and to profit from them.

Sections three and four celebrated the power of passing, not passing as anybody

(recognisably raced or gendered) using technology, but passing as illegible bodies in

all the fabulous manners before the technological gaze coded white and cis. BLM

activists put on makeup [noun. cosmetics] to confound a facial recognition system

that fails to see darker females. Black Trans Archive makes up [phrasal verb 1.

invent a story or plan; 2. (of parts) compose or constitute a whole] bodies that are

undefinable, in-the-wake, on-the-move, and mutating, denying the humanity that

denied Black Trans bodies, denying a ‘reality’ (the Cis City) that denies Black Trans

lives.

This text is suspicious of representations. We are interested in the process of

embodiment as experiencing through many bodies/avatars/skins. What do they

do? What experience do they inflict on others? The graph above illustrates the

points of entry in considering giving/deploying bodies in this essay:

X: from ‘material basis, the cost to create/sustain/protect these bodies’ to

‘immaterial free-floating, imagine!’// acknowledge the problem of access

associated with the bodies that labour and profit; responsibly imagine the

bodies/avatars we use to travel to or experience realities other than the AFK

one.

Y: from ‘visibility/legibility’ to ‘invisibility/illegibility’// false embodiment is

examined based on axis X. Illegibility is celebrated as a strategy to challenge

the pale/cis coded gaze.

Z: from ‘what they experience/do’ to ‘what “are” they’// acknowledge

bodies/avatars as the tools through which we experience or interact with the

worlds/platforms, be suspicious when they are used solely as signifiers for

profiling based on axis Y.

Talking in terms of embodiment and disembodiment instead of just

representation, this text asks: why do we need to be represented? Representation

for whom? As reflected in the later sections of this essay, there is an emphasis on

rejecting being represented/representable. To be represented requires the

represented to have an outside and to address the self to that outside, and that is

limiting.

Acknowledgements:

This research essay benefited greatly from the research project I participated in for

the EU’s Digital Futures conference in June 2021. Tereza Hendl, a moral & political

philosopher, Hannah Pelikan, a human-robot interaction researcher, and I were

commissioned by Goethe Institute & Polis180 to present on AI and ¿nclusion

(across refusal towards justice). We created a gather.town virtual storage

room/exhibition that demystifies artificial intelligence under three titles, two of

which were ‘beyond inclusion: towards AI just!ce. ¿nclusion>>refusal’ and ‘AI & tech

4 whom by(e) whom: (disembodied bodies (exploited’. Hendl and Pelikan

introduced me to many of the references used in this essay. Link to the space:
https://gather.town/app/arvKjVMnmrBie3mJ/AI%20for%20whom%20by%20whom

This essay also benefited greatly from the six-week course offered by the New

York-based writer and curator Moses Serubiri where we close-read The

Undercommons: Fugitive Planning & Black Study by Fred Moten and Stefano

Harney. In the discussions we explored the undercommons subjectivity and the

question about representation.

Reference list

@martymoment (2020). CV Dazzle: Hide from Facial Recognition. [online] www.instagram.com.


Available at: https://www.instagram.com/tv/CA2xbn3H_pX/ [Accessed 20 Jul. 2021].

@noname (2020). https://twitter.com/noname/status/1267918514268114944. [online] Twitter.


Available at: https://twitter.com/noname/status/1267918514268114944?s=20 [Accessed 7 Jun. 2020].

Adkins-Jackson, A.J., Brown, A.F. and Loeb, T.B. (2021). No internet, no vaccine: How lack of internet
access has limited vaccine availability for racial and ethnic minorities. [online] The Conversation.
Available at:
https://theconversation.com/no-internet-no-vaccine-how-lack-of-internet-access-has-limited-vaccine
-availability-for-racial-and-ethnic-minorities-154063.

Amazon (2021). Amazon’s Big Game Commercial: Alexa’s Body. [online] YouTube. Available at:
https://youtu.be/xxNxqveseyI [Accessed 5 Jul. 2021].

Antidote (2017). Metaphorical Gender in English: Feminine Boats, Masculine Tools and Neuter Animals.
[online] www.antidote.info. Available at:
https://www.antidote.info/en/blog/reports/metaphorical-gender-english-feminine-boats-masculine-t
ools-and-neuter-animals [Accessed 17 Aug. 2021].

Awere, H. (2020). How Is the Use of Predictive Analytics in the Criminal Justice System Negatively
Impacting Black…. [online] Medium. Available at:
https://medium.com/swlh/how-is-the-use-of-predictive-analytics-in-the-criminal-justice-system-negat
ively-impacting-black-98af030a0fc0 [Accessed 15 Aug. 2021].

Bailey, M. (n.d.). The Diigitals Muses // Misty. [online] The Diigitals. Available at:
https://www.thediigitals.com/misty [Accessed 7 Sep. 2021].

Benjamin, R. (2021). Which Humans? Innovation, Equity, and Imagination in Human-Centered Design
(Keynote). [online] www.youtube.com. Available at: https://youtu.be/kDcz44ifdQw?t=1287 [Accessed
7 Jul. 2021].

Brathwaite-Shirley, D. (2017). Daniel Brathwaite-Shirley. [online] Shades Of Noir. Available at:
https://shadesofnoir.org.uk/daniel-brathwaite-shirley/ [Accessed 7 Aug. 2021].

Brathwaite-Shirley, D. (2020a). Gaming, Visibility and Black Trans Experience: An Interview with Danielle
Brathwaite-Shirley. [online] 14 Aug. Available at:
https://www.berlinartlink.com/2020/08/14/gaming-visibility-and-black-trans-experience-an-interview
-with-danielle-brathwaite-shirley/.

Brathwaite-Shirley, D. (2020b). The artist using video gaming to tackle the erasure of Black Trans stories.
[online] Available at: https://london.sciencegallery.com/news/gaming-black-trans-stories.

Britannica (2019). Avatar | Hinduism | Britannica. In: Encyclopædia Britannica. [online] Available at:
https://www.britannica.com/topic/avatar-Hinduism.

Buolamwini, J. (2018). Gender Shades. [online] Gendershades.org. Available at:


http://gendershades.org/.

Buolamwini, J. (2019). The Coded Gaze: Bias in Artificial Intelligence | Equality Summit. [online]
www.youtube.com. Available at: https://youtu.be/eRUEVYndh9c?t=615 [Accessed 25 Jul. 2021].

CNN (2013). Meet the real woman behind the voice of Siri. [online] www.youtube.com. Available at:
https://youtu.be/z2bTymnb1uE [Accessed 7 Aug. 2021].

Crawford, K. and Joler, V. (2018). Anatomy of an AI System. [online] Anatomy of an AI System. Available
at: https://anatomyof.ai/.

Diversify Our Narrative Campaign (2021). Techno-Orientalism: The Hyper-Futuristic Perception of East
Asia in Science Fiction Media. [online] www.youtube.com. Available at:
https://youtu.be/l0X15iqjYn0?t=306 [Accessed 9 Jul. 2021].

Duhaime-Ross, A. (2019). The guy who created the world’s first digital supermodel says actual people
“will become heirlooms.” [online] www.vice.com. Available at:
https://www.vice.com/en/article/j5wdxp/the-guy-who-created-the-worlds-first-digital-supermodel-sa
ys-actual-people-will-become-heirlooms.

Fessler, L. (2017). We tested bots like Siri and Alexa to see who would stand up to sexual harassment.
[online] Quartz. Available at:
https://qz.com/911681/we-tested-apples-siri-amazon-echos-alexa-microsofts-cortana-and-googles-g
oogle-home-to-see-which-personal-assistant-bots-stand-up-for-themselves-in-the-face-of-sexual-har
assment/.

Fondation Brocher (n.d.). Evénements à venir. [online] www.brocher.ch. Available at:


https://www.brocher.ch/mobile/event/421/ [Accessed 15 Aug. 2021].

Fred Moten in conversation with Sondra Perry (2018). Frieze. 10 Dec. Available at:
https://www.podchaser.com/podcasts/frieze-464709/episodes/fred-moten-in-conversation-wit-34540003.

Hart, T. (2020). Dining on trauma: Danielle Brathwaite-Shirley talks trans-tourism, motherhood, &
being a “Freaky Friday everyday” | | atractivoquenobello. [online] www.aqnb.com. Available at:
https://www.aqnb.com/2020/08/10/dining-on-trauma-danielle-brathwaite-shirley-on-trans-tourism-
motherhood-and-being-a-freaky-friday-everyday/ [Accessed 8 Sep. 2021].

Hanley, D. (2021). New Research: Amazon Continues to Radically Expand Worker Surveillance Practices
That Predominantly Harm People of Color. [online] Open Markets Institute. Available at:
https://www.openmarketsinstitute.org/publications/new-research-amazon-continues-to-radically-ex
pand-worker-surveillance-practices-that-predominantly-harm-people-of-color.

Harvey, A. (2010). CV Dazzle: Camouflage from Face Detection. [online] cvdazzle.com. Available at:
https://cvdazzle.com/.

Hay, C. (2021). Long Read: Blackphishing, white lies. [online] 1 Granary. Available at:
https://1granary.com/opinion/long-read-blackphishing-white-lies/ [Accessed 9 Aug. 2021].

Hernandez, P. (2019). Playing Red Dead Online as a black character means enduring racist garbage.
[online] The Verge. Available at:
https://www.theverge.com/2019/1/15/18183843/red-dead-online-black-character-racism [Accessed
7 Sep. 2021].

Holmes, H. and Burgess, G. (2020). Opinion: Coronavirus has intensified the UK’s digital divide. [online]
University of Cambridge. Available at: https://www.cam.ac.uk/stories/digitaldivide.

Hook, D. (2013). The racist bodily imaginary: The image of the body-in-pieces in (post)apartheid
culture. Subjectivity, 6(3), pp.254–271.

J, F. (2020). AI Social Influencers. [online] Shades Of Noir. Available at:


https://shadesofnoir.org.uk/ai-social-influencers/ [Accessed 10 Dec. 2020].

Jackson, C. (2011). Violence, Visual Culture, and the Black Male Body. [online] Google Books. Routledge.
Available at:
https://books.google.co.uk/books?id=LztZBwAAQBAJ&pg=PA70&lpg=PA70&dq=cassandra+jackson+t
hese+symbols+of+black+manhood+can+be+domesticated [Accessed 7 Sep. 2021].

Larson, J., Mattu, S., Kirchner, L. and Angwin, J. (2016). How We Analyzed the COMPAS Recidivism
Algorithm. [online] ProPublica. Available at:
https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm.

Manton, C. and Campbell, C. (2021). Digital Cultures Webinar 2021: 6. AI and Bias. Feminist + Queer
Perspectives. May 27, 7PM (IST). [online] www.youtube.com. Available at:
https://youtu.be/tATIPvvZ60k?t=680 [Accessed 7 Sep. 2021].

McLean, M.L. (2020). Facial recognition can’t tell black and brown people apart - but the police are using
it anyway. [online] gal-dem. Available at:
https://gal-dem.com/facial-recognition-racism-uk-inaccurate-met-police/.

Millar, M. (2019). Facial recognition technology struggles to see past gender binary. Reuters. [online]
30 Oct. Available at: https://www.reuters.com/article/us-usa-lgbt-facial-recognition-idUSKBN1X92OD
[Accessed 7 Jul. 2021].

Moten, F. and Harney, S. (2013). The Undercommons: Fugitive Planning & Black Study. Wivenhoe:
Minor Compositions, pp.10, 19, 94.

Nakamura, L. (2002). Cybertypes: Race, Ethnicity, and Identity on the Internet. New York; London:
Routledge, pp.xv, 32, 46, 137, 277.

Netizens, K. (2019). Shudu, The World’s First Digital Supermodel. [online] Parblo. Available at:
https://www.parblo.com/blogs/guides/shudu-worlds-first-digital-supermodel [Accessed 7 Jul. 2021].

Perry, S. (2018). Adrift in the chroma key blues: A chat with Sondra Perry on black radicality + things that
are yet to happen in Typhoon coming on. [online] Available at:
https://www.aqnb.com/2018/05/01/adrift-in-the-chroma-key-blues-a-chat-with-sondra-perry-on-blac
k-radicality-things-that-are-yet-to-happen-in-typhoon-coming-on/.

Quadri, T. (2021). The Future of Avatars. [online] Chimera. Available at:


https://www.chimera.news/post/the-future-of-avatars [Accessed 25 Aug. 2021].

Ravitz, J. (2013). “I’m the original voice of Siri.” [online] CNN. Available at:
https://edition.cnn.com/2013/10/04/tech/mobile/bennett-siri-iphone-voice/index.html [Accessed 7
Aug. 2021].

Rezaire, T. (2017). Deep Down Tidal. [video essay] Available at:


https://www.idfa.nl/en/film/b34ff427-21b2-4b49-b44b-cbbf6aa246f6/deep-down-tidal.

Russell, L. (2020). Glitch Feminism: A Manifesto. London: Verso, pp.9, 24, 35, 45, 47, 73, 85, 87, 124, 147.

Russell, L. (2021). How Danielle Brathwaite-Shirley Archives the Black Trans Experience. [online]
ARTnews.com. Available at:
https://www.artnews.com/art-in-america/features/danielle-brathwaite-shirley-1234594591/
[Accessed 8 Sep. 2021].

Schaffhauser, D. (2020). Poverty, Race Linked to Lack of Internet for Students -. [online] THE Journal.
Available at:
https://thejournal.com/articles/2020/05/14/poverty-race-linked-to-lack-of-internet-for-students.aspx.

Settles, I.H., Buchanan, N.T. and Dotson, K. (2019). Scrutinized but not recognized: (In)visibility and
hypervisibility experiences of faculty of color. Journal of Vocational Behavior, 113, pp.62–74.

Shudu (Ama Badu) (2019). What Does the Daily Routine of a “Virtual Idol” Look Like? We Spoke with 4
to Find Out. [online] HYPEBAE. 4 Jul. Available at:
https://hypebae.com/2019/7/virtual-idols-generation-noonoouri-shudu-ruby-gloom-imma-tech-influ
encers-interview [Accessed 5 Aug. 2021].

Smedley, A. and Smedley, B.D. (2005). Race as biology is fiction, racism as a social problem is real:
Anthropological and historical perspectives on the social construction of race. American Psychologist,
60(1), pp.16–26.

The Diigitals (2020). The Diigitals Models. [online] The Diigitals. Available at:
https://www.thediigitals.com/models.

The Diigitals (n.d.). THE WORLD’S FIRST ALL DIGITAL MODELLING AGENCY. [online] The Diigitals.
Available at: https://www.thediigitals.com/about.

The Economist (2021). How medicine discriminates against non-white people and women. [online] The
Economist. Available at:
https://www.economist.com/science-and-technology/2021/04/08/how-medicine-discriminates-again
st-non-white-people-and-women [Accessed 20 Aug. 2021].

Valenti, L. (2020). Can Makeup Be an Anti-Surveillance Tool? [online] Vogue. Available at:
https://www.vogue.com/article/anti-surveillance-makeup-cv-dazzle-protest.

Vincent, J. (2021). Meet the real Alexa: voice actor reportedly responsible for Amazon’s AI assistant
revealed. [online] The Verge. Available at:
https://www.theverge.com/2021/5/11/22430185/alexa-voice-actor-amazon-nina-rolle [Accessed 7
Sep. 2021].

Wark, M. (2020). Sensoria: Thinkers for the Twenty-First Century. London: Verso.

World Civilization (n.d.). Sanskrit | World Civilization. [online] courses.lumenlearning.com. Available


at:
https://courses.lumenlearning.com/suny-hccc-worldcivilization/chapter/sanskrit/#:~:text=Sanskrit%2
0is%20the%20primary%20sacred [Accessed 7 Sep. 2021].

wysingbroadcasts.art (n.d.). SEEUSINTOUS. [online] wysingbroadcasts.art. Available at:


https://wysingbroadcasts.art/#seeusintous.
