Week 2 - Technology and Platform Politics
Power and authority are usually complex and embedded in other social/technical
systems. Platforms do not disintermediate (they are not neutral brokers; they
mediate in a different way)
Morality and politics – example of self-driving cars
The two become tied together: people who can afford cars are rich/privileged.
To prioritize the life of the driver or the pedestrian = to prioritize the wealthier or the poorer
o Note: no driver would buy a car that won't prioritize them, ie protecting the
driver is the obvious choice for manufacturers
Politics is ingrained in today's world
(technological) Politics
Visibility vs. invisibility
Accessibility vs. inaccessibility
Authority, status
Inclusion vs. exclusion
Popularity and desirability
Normality vs. controversy
Definition 1: arrangement of power/authority in human associations as
well as the activities that take place within those arrangements
Things don't have politics, people do – the artifacts and tech don't matter;
what matters is the social/economic system they are embedded in
o If this is the case, the social determination of tech = plain old social
determination
How artifacts become political
o Technical arrangements as forms of order (see tech innovations as
legislation)
Technical arrangement precedes the use of the things.
Technology is used to enhance power, authority, and privilege
politics come from the way design relates to power
tech can represent and extend existing arrangements of
power
Tech seems like a neutral tool but can be designed in ways
that produce consequences prior to its professed uses. Thus it is
a way of building order in our world
Choices in design can have consequences for social struggles.
As decisions are made, different people end up possessing
unequal degrees of power
Tech can be designed to favor certain social interests; some
receive a better hand than others
During invention/designing, system/device becomes a way of
settling an issue in a particular community
Eg: the Long Island, New York bridges. Robert Moses (master
builder of roads, parks, and public works; accused of social class
bias and racism) built a series of low bridges over the roads in
Long Island leading to its beaches. These bridges allowed cars
but not buses – supposedly for a pleasurable driving experience.
His public works that favor the automobile will continue to shape
the city: LESS ACCESSIBLE TO THE POOR
Architecture, city planning, public works, etc. can contain implicit or
explicit political purposes
Technologies may not be introduced to increase efficiency (may
have ulterior motive)
Eg: Cyrus McCormick, founder of the McCormick Harvesting
Machine Company, added molding machines (a new and
untested tech) to his manufacturing plant. The machines
produced inferior castings at a higher cost. Purpose: to
destroy the union. He wanted to weed out the skilled workers
who were organizing a union local in Chicago; the machines
could be manned by unskilled labor. The machines were
abandoned after 3 years.
o Definition 2: Inherently political technologies
a man-made system that requires, or is strongly compatible
with, particular political relationships
practical necessity (social conditions needed to operate the
technical system) eg: workers running cotton-spinning
mills have to work regular hours and subordinate their
individual wills to the man running the factory. Workers
are subject to production schedules which are
linked to train schedules. Foremen are thereby pressed to
discipline workers so that they can achieve production
units on time.
Lewis Mumford brought up the example of solar panels vs.
nuclear energy
strongly compatible with, eg: solar energy is decentralizing in
nature; a solar system is compatible with a democratic society
since accessibility and controllability are better when solar
energy is disaggregated (dispersed). It's easier, if expensive, to
install solar power at a small scale, and solar reduces people's
reliance on a centralised grid. Solar power is therefore a
more democratic technology.
eg: the atom bomb demands a hierarchical chain of command
for safety, to prevent its unpredictable use. Nuclear energy, he
argued, is inherently centralising because of the way a
nuclear power station has to be designed, built, and
maintained, and the way its power is distributed.
certain technologies lack flexibility in the way they are built; thus
choosing them equates to choosing a particular form of political life
those in power justify strong authority on the basis of
supposedly necessary conditions of technical practice
INTENTION
politics is often traced to intention, but even if unintended, tech can be political
Corporate power
Large companies are not monopolizing the market since they have
competitors, so they cannot be targeted with antitrust regulation. But they hold
immense power
o Monopolies: “evil” because they fix prices and keep competitors out of
the market
o BUT modern platforms like Google and Amazon are not doing that: they
keep prices low, make purchases convenient, and do not actively
work to keep competitors out
o Old solution (for socially necessary companies eg electricity, public
transport): public utilities – monopoly that is tightly regulated by public
sector
o But public utilities don't work anymore: governments want to attract
businesses, it is unclear who/what sector to put under public utility
status, and it is difficult to regulate prices in a fair and transparent
manner; ie we need a new legal framework to understand how these
brokers operate
Information economy platforms
o Large platforms are clearly not just agents for information transfer; they
can organize what users see. Eg: Google underwent a lawsuit for
allegedly breaking antitrust laws through advertising sales manipulation;
the suit claimed Google misled publishers and advertisers about the price
and process of its advertising auctions. Another eg: Google was fined
$2.7 billion for EU antitrust violations over shopping searches. Google
lacks transparency in how its search-ranking algorithm works. Selection
of sites necessarily entails exclusion, and most users do not look beyond
the first page of search results
o Not neutral, shapes what we see
Disintermediation
o Definition: the technical removal of cumbersome / inefficient
intermediaries in the supply chain
Eg: on Amazon, an author can self-publish instead of relying on
a publisher
o aesthetic and affective sensibility of direct connection between
suppliers and buyers
feel connected to the seller/purchaser, eg: Grab's 'live' display of the
driver coming to pick you up, or being able to text the Grab driver
Common carrier
o Common carrier definition: Traditionally refers to a person or a
company who transports goods or people for others
o For telecommunications: a common carrier is an entity that carries
signals and content but cannot favor some content over others (e.g. an
ISP); it needs to be neutral and cannot discriminate. E.g. telephone
operators cannot interrupt and say "this conversation is inappropriate"
Safe harbor
o Intermediaries are not liable for things passed through the platform
o Platforms are neutral brokers, middleman, an in-between
o Eg: youtube cannot be sued if someone posts an inappropriate video
o The safe harbor provision is granted with the expectation that platforms
can responsibly govern and regulate themselves
o Eg: TikTok made a commitment to transparency, particularly when it
comes to how it moderates and recommends content
Dangers
o Middleman position (connect buyer and seller) can be used to
threaten/have leverage over producers.
Companies are replacing traditional middlemen (eg:
wholesalers, local cab companies)
Can encourage and discourage sales
Can block sellers access to vast user base
Eg: the case of Amazon and Hachette (Hachette wanted to set prices
for its own ebooks; Amazon didn't want that and refused to accept
preorders for the publisher's books). Amazon subjected books
published by Hachette to artificial purchase delays
Are tech companies using their power legitimately? They have
power over speech and content. Eg: Cloudflare's CEO mentioned
in an interview that it is scary that tech companies can make
content or regulatory decisions
Eg: Spotify removed white-power music; Apple Pay doesn't work
for far-right merchandise
Eg: In 2020, the Justice Department filed a civil antitrust suit
against Google for monopolizing search and search advertising.
As a result of its illegal monopoly, and by its own estimates,
Google pockets on average more than 30% of the advertising
dollars that flow through its digital advertising technology
products. Google’s anticompetitive conduct has suppressed
alternative technologies, hindering their adoption by publishers,
advertisers, and rivals.
o Companies are online and can sidestep regulations (eg: licensing,
safety, wage and labor rules)
Eg: Uber drivers have less bargaining power since the app is their
crucial link to customers. They have to accept Uber's fee structure
while not having the worker protections of taxi companies. Also,
Uber has little legal responsibility: in Singapore, Uber had
knowingly rented out faulty Honda Vezels (with fire-prone parts)
to its drivers
Eg: Airbnb allegedly allows large-scale landlords to pretend to be
individuals renting out rooms so they can avoid hotel regulations
and taxes
Eg: in Japan, there are strict and peculiar rules for hotels, dictating
everything from the length of reception desks to the colour of
pillowcases. Airbnb allowed real estate analyst Aileen Jeffery to
sidestep these regulations.
Once a platform reaches a critical mass of consumers, producers, or both,
these groups become vulnerable to the platform's control over standards
and policies (they are at the platform's mercy)
o Critical mass: platforms lean towards a monopolistic structure due to
network effects (see the sketch after this list)
o hard to challenge a dominant platform
o Platforms are corporate entities that can impose their own terms of use
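A minimal sketch of the network-effects point above, using the Metcalfe-style assumption that a platform's value grows with the number of possible connections between users (the user counts and the value formula are invented for illustration):

```python
# Metcalfe-style assumption: a platform's value scales with the number of
# possible pairwise connections among its users (illustrative only).
def network_value(users: int) -> int:
    return users * (users - 1) // 2  # n choose 2 possible connections

incumbent = network_value(1_000_000)   # platform at critical mass
challenger = network_value(10_000)     # new entrant
print(incumbent // challenger)         # ~10,000x the connective value from 100x the users
```

Under this assumption, a challenger with 1% of the users offers roughly 0.01% of the connective value, which is why dominant platforms are so hard to displace.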
Politics and technology
Politics definition: “arrangements of power and authority in human
associations as well as the activities that take place within those
arrangements.”
Not about government but about POWER
Importance of it
o Technologies are involved in “building order in our world.”
o Classification and arrangement of people and things
o Structures privilege, bias, discrimination, power, authority
Politics vs politics defined as power
The act of voting vs. who is allowed to vote (exclusion of certain peoples as
citizens)
How news media can change political opinion vs. how Google makes it
harder for you to chance upon some news media sites (some ideas become
inaccessible)
The institution of government policy vs. whose interests the policy advances
(Whose interests are represented? Whose interests are controversial or
unspeakable?)
Week 4 - Communicating through Technologies
Media ideologies (How you believe it should be used)
Definition: a set of beliefs about communication technologies with which users
and designers explain perceived media structure and meaning
o people’s beliefs about how a medium communicates and structures
communication – subjective
o Cannot be true or false – just a belief
o “Media ideology” is an analytical tool for understanding ways in which
all communication is socially constructed and socially interpreted
o Nothing inherent in a medium makes it formal or informal – it all
depends on people's media ideologies
o Unstated, not conscious, come into existence in the moment of action
o tend to perceive media ideologies only when there is a failure in
communication
A message interpreted in a way you did not intend =
failure (clashes of media ideologies cause misunderstandings
and conflict)
The message can be the same yet understood differently, since people
have different understandings of the medium
How are media ideologies made?
o Remediation
Definition: "the ways that people interlink media, suggesting that
people define every technology in terms of the other
communicative technologies available to them"
Our ideas about previous technologies can influence our
assumptions about how a new technology is to be used
Eg: Direct correlation between snail mail and email
o Comparison with other technologies
Our ideas about a technology are tacitly understood in relation to
what we can do with other technologies
What the alternatives are shapes how we understand what the
medium might be useful for / should be used for
o Second-order information
Media ideologies are responsible for how second-order information
works
Definition: information that guide you to understand how words
and statements should be interpreted
Media contain second-order information because of people's
media ideologies
Factors that influence second-order info
Technical characteristics of a medium
o How much time info takes to travel (synchronous or
asynchronous), word limits
o All these factors influence the idea of what the
medium should be used for
Paratext
o material surrounding the text that gives it additional
meanings
o The choice of font, color of font, formatting,
emoticon, hashtags, etc
o Eg: Comic Sans conveys a silliness and irreverence that is
hardly suitable for serious matters. Associated with childcare
and entertainment, its usage in business or
professional settings has been criticized by many
aesthetics-conscious Internet users
Idioms of practice (your group’s habit of using it)
o Definition: the ways people end up using technologies with a
communal flair, characteristic of dialects
o Idioms of practice emerge out of collective discussions and shared
practices
o Eg: Friending people to see their wall and then unfriending them
immediately
o A group develops their own way of using media to communicate (ie
different groups may have different idioms of practice)
May be understood as socially inappropriate due to their
idiosyncratic nature (social etiquette established in one’s group
may not be accepted or tolerated by others)
o Reason 1 (for this existing): new media are new; there is still no
widespread consensus on the social rules that govern the use of the
media (communication is filled with uncertainty; there has not been
enough time for widespread consensus to form on how, or if, different
media should be used for different communicative tasks)
Note: over time practices can change from being idioms to being
widely accepted practices
o Reason 2: people may face social dilemmas when using new media and
thus may figure it out with a friend (eg: can't bear to confirm a breakup
on Facebook, so they get a friend to help click the button)
New media =/= old media but blind media
Media ideologies show us that communication is both socially constructed and
socially interpreted.
Virtual Personal assistant
Gender
o biological category
o socially and culturally constructed category with meanings attached to
this biological assignment
o we assume there is a natural link between biology and attributes
(stereotypes)
Women – the “fairer sex”; “more meticulous”; “more
communicative”; “less skilled in rigorous sciences”
Men – “stronger”; “more stoic”; “less communicative”
The computer was not initially conceived as a tool for connecting people
o ELIZA, the first chatbot – simulated a psychotherapist, encouraged reflection
o people humanise their interaction with computers (as in see computer
as human)
Consumers see these objects that serve them as female, abetting the prejudice
that women are considered objects
o Field of AI seen as having low gender diversity, users indulge in
traditional sexism
Alexa is female; Cortana has no gender; Siri (a feminine name, though) is
genderless – yet the virtual assistant seems coy and wide-eyed, knowing and wry
o Gendered female by default or through intonation (previously)
o Women are assumed to be more compliant, and therefore better at
doing menial tasks
o These "natural" emotional skills also render a woman less intelligent
and qualified
o Assumptions/stereotypes of the female cause bots to be designed that
way (so that bots are associated with the positive skills/characteristics
of a woman)
o As designers design technologies to replicate our human order, they
also design the politics in.
Why the need for gender? TO INCREASE ENGAGEMENT
o Anthropomorphization: Attribution of human traits, emotions, and
intentions to non-human entities
o We relate better to technologies that mimic our human world
o Female voices are thought to be warmer, more approachable and
nurturing
o Tech should speak without giving offense
induce likability, sound “sassy”
Reinforces the idea that women are unlikely to retaliate
"You're a bitch" – Siri's response: "I'd blush if I could"
represents and extends existing arrangements of power
(the capabilities and characteristics stereotypically ascribed to
women, objectification, the "assistant" role, being unlikely to
retaliate, patriarchy; design and widespread usage reinforce the concept)
Note: Don’t need to have a gender to have politics of gender
Tech is created by humans, so their perspectives are reflected in the tech they
design; tech inherits the biases of humans, which makes machines political. We
can see the social ramifications of tech being sexist
promotes unfair gender stereotypes
Siri, Alexa, Cortana, and Google Assistant—which collectively total an
estimated 92.4% of U.S. market share for smartphone assistants—have
traditionally featured female-sounding voices.
depict and reinforce existing gender-job stereotypes
when we add a human name, face, or voice [to technology] … it reflects the
biases in the viewpoints of the teams that built it
In a 2016 survey, female engineers were twice as likely, compared to male
engineers, to report performing a disproportionate share of this clerical work
outside their job duties.
Clerical work = things like admin work, minutes
Media and online dating
Online activity is highly influenced by media ideologies; people tend to scrutinize
others more closely when they are more uncertain about a person's credibility
o Media ideology can also lead people to be more lax and believe that
profiles are mostly truthful: people are just putting their best foot
forward, not being deceptive
Virtual is not inferior to face-to-face interactions/relationships; it all depends on
how people understand relationships and what they mean
Online dating = a lot of communication uncertainty
Impression management
o Impressions given: spoken or typed in the traditional sense
o Impressions given off: nonverbal communication cues
Deep concern about credibility (are they who they say they are)
o Move (from dating app) to another communicative medium to establish
credibility
o Focus on paratext (more likely to be unintentional, thus more truthful)
Misspellings (indicate education and capability)
Length of texts (too long = desperate; too short = insincere)
Timing (eg: replying too late)
Week 5 - Big Data and the Personalized Web
Cass Sunstein, 2001 – the filter bubble
internet’s most pernicious effect on the public sphere
Dangers of excessive personalisation
o Invisible auto-propaganda
o We are imbued with our own familiar ideas and are oblivious to the
unknown
Note: Google and Facebook do allow users to turn off most filters and revert to
an unpersonalized web
Big data
Stephens-Davidowitz is making a claim about epistemology (the “definition of
knowledge”)
“Big Data reframes key questions about the constitution of knowledge, the
processes of research, how we should engage with information, and the
nature and the categorization of reality”
o Google is called a truth serum; the author's argument: nobody tells the
truth (people are embarrassed, they twist the truth); the only way to see
what people really think is to use their Google search histories
o Data collected in such a naturalistic setting would make it "true"
Big data can be understood through three interrelated factors
o Technology
Technologies that enable the collection and analysis of large
datasets
maximizing computation power and algorithmic accuracy to
gather, analyze, link, and compare large data sets
Data = “raw” and “unprocessed” facts
Eg: what time log in, how long you play game, what
country u playing from
Big data is not new, but the amount of data collected has increased
enormously in the current day and age: an INCREASED CAPACITY
FOR DATA COLLECTION
o Analysis
drawing on large data sets to identify patterns in order to make
economic, social, technical, and legal claims
Old (symbolic/rule-based AI)
top down approach = designer thinks of how users may
use/interact with programme
Current (machine/deep learning)
Idea is that the computer learns from the data itself,
developing patterns in the data
Outcomes do not have to be comprehensible, and are
often black-boxed
Each outcome comes from too many data-points to
understand; it is beyond human understanding
Results are predictions and probabilities, they do not
explain why the data points make up the prediction
o Eg: a troll with 120 hours of playtime and who has
5 people on their friend list is most likely to be a
criminal (see the sketch after this list)
Goal: change raw data to “cooked,” “processed” data
aimed at providing trends and insights
Predictive analytics: *data becomes information* that can be
used for policing; use a data set of criminals, let the computer
recognize similarities between users and criminals so as to spot
potential criminals (see if game behavior is normal
behavior or similar to criminal behavior)
It is up to humans to comprehend why it is so (since the AI
cannot explain). As a result, there is a high chance of error:
the AI will just get things wrong with certainty (and we don't
know why it got it wrong either)
Infoglut: A situation when information becomes
overwhelming and incomprehensible
o We are having a moment of infoglut since, presently, there is
a shift from comprehension to correlation
o Demand for actionable results (ie AI that works)
means that predictions can become so complex
that comprehensibility is compromised
John Cheney-Lippold: personal identity and algorithmic
identity will never be the same. The algorithm is uninterested
in getting to know the whole of you
Soft results – AI predictions are probabilities that are
meant to be changed, constantly revised as more data is
captured
o More data = better algorithm. Big data analytics
emphasize a process of learning. If a prediction is
wrong, companies will not delete the algorithm; rather,
they work on improving it (by feeding it more data)
o This is also an issue, since wrong predictions can have
consequences, yet companies can just release a
statement saying they are working to improve and revise
Algorithmic bias issue
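A minimal sketch of the predictive-analytics idea above (eg the playtime/friend-list example): the model outputs a bare probability with no accompanying explanation. This assumes scikit-learn is available; the features, labels, and numbers are invented for illustration, not taken from any real system:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical "cooked" training data: [playtime_hours, friends_on_list],
# labeled 1 if the account was previously flagged, 0 otherwise.
X_train = [[120, 5], [10, 200], [90, 8], [5, 150], [110, 3], [15, 180]]
y_train = [1, 0, 1, 0, 1, 0]

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# The output for a new user is just a probability: the model cannot say
# *why* these data points add up to the prediction.
print(model.predict_proba([[120, 5]])[0][1])  # a high probability, no explanation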
o Mythology
The belief that large data sets can offer a higher form of
intelligence that can generate insights that were previously
impossible, with the aura of truth, objectivity, and accuracy.
"Previously impossible" eg: insights humans cannot figure out on
their own
Big data offers this myth as it takes data (raw, unorganized,
unprocessed facts) and translates it into information
(processed, organized, interpreted data)
The larger your data set, the more it seems that you can arrive
at objective, accurate informative truth.
Big data problems
o Possibility:
Stereotypes
Representation
Racial profiling
Facial features as an identifier for criminality
o Baseline of the problem: the data set. Garbage in, garbage out
o 1. Claims to objectivity and accuracy are misleading
Assumes objectivity in historical data
However, our data are already encoded with values; our biases
produce biased data, which is encoded into the big data
analysis
Eg: black, brown or Hispanic more likely to be stopped and
checked for criminal activity
The process of cleaning data definitely makes it subjective
o 2. Bigger data are not always better data
Data = sample not population, problem of underrepresentation
Do all users use Google? Is Google overrepresented amongst
specific geographical, cultural, or linguistic populations?
Poor people have no money to see a doctor, so their info is not
captured by healthcare data, and hence not by big data
The algorithm is not for the population if the data is not
representative; the algorithm would only be for those who
contributed data
Eg: Because it is easy to obtain Twitter data, scholars have
used Twitter to examine a wide variety of patterns, media event
engagement, political uprisings, and conversational interactions
Problem 1: Twitter does not represent 'all people'. Some
people share accounts, some people have multiple accounts,
and there are bots
Problem 2: While some users post content frequently
through Twitter, others participate as ‘listeners’ (Crawford
2009, p. 532). Twitter Inc. has revealed that 40 percent of
active users sign in just to listen (Twitter 2011). The very
meanings of ‘user’ and ‘participation’ and ‘active’ need to
be critically examined
o 3. Taken out of context, Big Data loses its meaning
Stephens-Davidowitz's theory: when you search for something,
it means you are concerned about said thing
But it is possible that we search our slight concerns and speak
to people about our deep concerns
Google =/= truth serum, since it may not really be representative
of your "true" desires
o 4. Just because it is accessible does not make it ethical
People often do not provide their consent and have no control
over how their data will be interpreted and used
Violations to privacy
We consent to the terms of use but don't want our data to be used
Eg: AI art generators
o 5. Limited access to Big Data creates new digital divides
2 problems with the "digital divide"
Data scientists vs. non-data scientists
Among data scientists: sexism and discrimination
Use of big data
o Help marketers source and identify potential customers
o Adjust promotions, wage increments and to make decisions on layoffs
o Criminal identification and prevention
o Medical diagnosis; prescription of care; preventive medicine
o Target consumers/users with a personalized browsing experience
Example of fighting crime:
+ve: police departments around the world continue to do this, utilizing much more
precise and accurate tools (PredPol, ShotSpotter, IBM i2 Coplink, Microsoft Azure
Data Lake and Watson Analytics) that learn from large datasets, including police
records, surveillance cameras and other sources, to predict crime before it happens.
These technologies have not only proven effective in predicting property crime, but
also in preventing terrorist attacks, and tracking down sex offenders and financial
criminals. Some results have been impressive, as in the UK, where digital crime
maps are considered to be 10 times more effective at predicting crime than
police officers.
-ve: In a report that based its findings on a 2021 study, the Vienna-based EU Agency
for Fundamental Rights (FRA) said algorithms based on poor data quality could
harm people's lives. Some demographic groups may be more often connected to
"simpler crimes," which can lead to bias in law enforcement, FRA said.
On offensive speech moderation, the agency found algorithms developed for
automated hate speech detection unreliable, as they flagged non-offensive terms in
English alluding to "Muslim," "gay" or "Jew" as offensive.
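A deliberately naive sketch of why automated hate-speech detection can flag non-offensive mentions of identity terms: a context-blind keyword filter. The word list and sentence are invented for illustration and are far cruder than real moderation systems:

```python
WATCHED_TERMS = {"muslim", "gay", "jew"}

def naive_moderator(post: str) -> bool:
    # Flags any post that merely mentions a watched term, ignoring context.
    words = {w.strip(".,!?").lower() for w in post.split()}
    return bool(words & WATCHED_TERMS)

# A neutral, non-offensive sentence is still flagged as offensive:
print(naive_moderator("The exhibit celebrates Muslim art history"))  # True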
Personalized web
Interest curation: An attempt to personalize and customize your web browsing
experience – this includes your consumption of news
o Google uses data it has of you (from history, gmail, maps, cookies),
and data of ‘people like you’ to decide what you want
News consumption
o provide you with news that you will want
o From echo chamber to the filter bubble: Information shared between
like-minded people vs. the algorithm feeding you news based on what
it thinks you want to see
o This is an increase in personalisation (eg: conservative/liberal news;
stances on a debate; the nature of issues); see the sketch after this list
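A minimal sketch of interest curation: rank articles by overlap with a user's click history, so the feed drifts toward what the user already reads. All the data here are invented for illustration; real systems are far more elaborate:

```python
from collections import Counter

user_interests = Counter(["election", "tax", "election", "border", "tax"])  # click history

articles = {
    "Tax bill passes senate": {"tax", "senate"},
    "New climate report released": {"climate", "science"},
    "Election polls tighten": {"election", "polls"},
}

def score(keywords: set) -> int:
    # An article scores higher the more it matches past clicks.
    return sum(user_interests[k] for k in keywords)

for title, kw in sorted(articles.items(), key=lambda a: -score(a[1])):
    print(score(kw), title)
# The climate story scores 0 and sinks out of view: the seed of a filter bubble.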
Filter bubble (partisanship) and filter bubble (impact on news production)
o Loss of the “front-page” unifying element
everyone on the same page thinks the same things are important,
a shared focus; ie the front page of the news connects individuals
o Creates more fragmentation and partisanship *ie bias
o Arlie Hochschild: “Empathy walls”
more and more difficult for us to see things from another
person’s worldview since we are not fed the info, cannot discuss
lack of understanding, presence of wall
empathy wall: an obstacle to deep understanding of another
person, one that can make us feel indifferent or even hostile to
those who hold different beliefs or whose childhood is rooted in
different circumstances.
Filter bubble criticism :
o People may actively seek out alternative viewpoints
o News not the main thing that unites us
Sense of togetherness can come from movie, tv and music
o Personalization may be good for you
Personalization may even be helpful in getting you to identify
good alternative viewpoints
At least you are getting news; the algorithm gives you news that
you would otherwise not have known about
o Who should decide on how we should think? Decisions on relevance
vs. algorithmic paternalism
Inside-Out (Political Q)
How you can ‘break out of your bubble’, be exposed to different viewpoints
Outside-In (Political Economy Q)
Filter bubbles relate to an attention economy; how do news organizations
ensure that their news articles will be clicked on?
“Digital-first” publishing model
o A situation where publishers choose to distribute information online at
the expense of traditional media
o they do not just move content from print to the web
o Such a publishing model fundamentally transforms the nature of news
production
Publish online first + all publication geared to fit how people
consume news online (vs. consuming news in print)
o Impacts economics of news production
In the past, print ad revenue was greater than paid subscriptions;
newspapers were reliant on (print) advertising revenue
Online ads bring in only a fraction of the revenue that print
ads used to bring in
Aggregators like Google, Facebook, and Twitter bring in site
traffic but not money
People see the content, but views =/= money (digital)
Layoffs, less edits, and the push to build a stronger “digital-first”
presence
Eg: TODAY goes digital – loss of 40 jobs; SPH cut 230 jobs
(ppt slide 111)
o Influence the way people write, and types of content produced
Emphasis on “catchiness” and “spreadability” so as to bring in
advertising dollars
Use metrics to judge a journalist’s performance
Produce articles with shorter journalistic timelines
News-subsidy organisations write the news; journalists edit
and release it
Printing, layout
Shift towards automated news-writing software
Natural language generation (NLG) is a technology that
transforms data into clear, human-sounding narratives
(eg: Automated Insights, used by Yahoo); see the sketch below
Emphasis on virality can reduce the resources dedicated to
good journalism
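A minimal sketch of the template-based idea behind automated news-writing (NLG): structured data in, a readable sentence out. The game data are invented, and real systems such as Automated Insights are far more sophisticated than a fixed template:

```python
game = {"home": "Lions", "away": "Bears", "home_score": 24, "away_score": 17}

def recap(g: dict) -> str:
    # Pick winner/loser from the scores, then fill a fixed sentence template.
    winner, loser = (("home", "away") if g["home_score"] > g["away_score"]
                     else ("away", "home"))
    return (f"The {g[winner]} beat the {g[loser]} "
            f"{g[winner + '_score']}-{g[loser + '_score']} on Sunday.")

print(recap(game))  # The Lions beat the Bears 24-17 on Sunday.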
Digital-first changed publishing landscape.
Democracy relies on the capacity for contested viewpoints. Filter bubbles reduce
empathy for people unlike yourself, making it harder to understand each other
Week 6 – digital labour
Digital labor is an analytic used to understand the relations of power that involve the
interaction of digital technologies, work, and capitalism.
Topics of digital labor:
o Algorithmic labor – how algorithm affect work
o Creative & Technology work / Immaterial labor
o Automation
o Working-from-home
o Playbor – blur of lines between work and play
The digital presence in labour is getting stronger and stronger - more and more
kinds of work that are traditionally non-digital are becoming digitized, eg: food
delivery (GPS systems)
Karl Marx
Work
Work is not natural, but a constructed social relationship
Marx is not against labor – he understands labor as a creative process by
which humans interact with their world and express their passions
Labor, however, becomes involuntary (socially necessary) when people are
dispossessed of access to the means of production
Dispossession: the action of depriving someone of land, property, or other
possessions
o Deprived of the capacity to survive without working - as a result, it
becomes necessary for you to sell the energy of your body and mind
("labor power") for a wage, such that you can survive and reproduce
o Privatization of the means to production (land, knowledge, tools…)
o Dismantling of social systems that make alternatives possible (barter
trading, communal systems, social institutions…)
Erasure of institutions that can ensure possibilities of survival
outside a wage makes work normative
o Work requirements penetrate institutional processes as goals and
norms (universities, unemployment benefits…)
Getting a job is not the only function of university, but it became
the primary function in a system that prioritizes work
o Render dispossession ordinary; make people unaware of these power
relations
The social relations of production produce class relations (the proletariat,
who work, vs. the capitalists, who own the means of production)
o Left unchecked, this process is self-perpetuating (capitalists acquire
the surplus value and use it to get more capital)
o This puts the burden on workers, whose actions reproduce and
enhance the relations of capitalism
work is socially constructed, involves relations of power
o There are social interests in place
o It means that there is exploitation involved in it, even though its
dynamics can differ
o The relations of power are obfuscated and naturalized – ie people don't
question "why do I have to work?"
Labour is modelled after
o 1800s ~ 1960s Industrial, or a Fordist regime of work (mass
production, ford motor company)
Monotony
Hardship
Alienation
o Information and creativity, or a post-Fordist regime of work
Playfulness
Autonomy (self-government)
Free labour
Tiziana Terranova used the term “free labor” in the late 1990s to discuss the
‘netslaves’ of America Online
o Recruited volunteers for “community leader” positions
o 3 to 4 hours of responsibilities a week, which included manning chat
groups, helplines, organizing fantasy sports games
o Dispute centred on the definition of “employee” and “volunteer”
o Volunteers were considered “employees” because they had to clock-in
and clock-out (monitored hours)
o Eg: the Huffington Post – "In my view, the Huffington Post's bloggers
[not paid] have essentially been turned into modern-day slaves on
Arianna Huffington's site."
Not a new issue (housework, slavery, internship)
Labor that is “simultaneously voluntarily given and unwaged, enjoyed
and exploited” (Terranova, 2000, p. 33) – us using social media
Free labor that does not feel like exploitation at all; people may even think
they are getting a good deal
Voluntary contribution vs. employee
Platforms and the monetization of personal data
o We are producing free labor
o TV: audiences are all eyeballs for advertisers; television produces the
“audience commodity”
we think we are just consuming but actually we are already a
commodity
o The capacity (and pressure) for people to produce and upload digital
content has increased the stakes of free labor
Contribute posts to social media, contribute eyeball content,
personal data
o giving your customers stuff in exchange for their personal data, which
you then use to make money
eg: sign up free gift, promo code
o Platforms and the Monetization of Content
Our contributions add valuation to these companies
Meta: $461bn
YouTube: $23bn
WhatsApp (when bought): $19bn
ByteDance: $300bn
These companies do not charge or charge only nominal
sums for content
o Why are they valued so highly? Because many
users
o High number of users. What attracts users?
Content
o Some of them were bought over when they had
yet to have strategies for monetization. What were
people buying? Buying users
o Companies do not produce good jobs for society
Few full-time payroll employees (they outsource a lot, eg: ByteDance)
+ massive layoffs recently
Eg: the majority of Google's non-permanent staff are outsourced
workers (temp workers employed directly by Google
make up only 3% of temp staff)
Meta: $461bn, 71000 (2021)
Google (Alphabet): $1.2 trillion, 190,234 (2023)
ByteDance: $300bn, ~100,000 (-est 2023)
WhatsApp: $19bn, 55 (2014)
Grab: $12.82bn, 7000 (2022)
Very low employee-to-revenue ratio (the rich get richer; no
redistribution of wealth)
"Rather than being treated as assets that companies seek to
invest in, they have become costs to be minimized."
This assumes a zero-sum relationship between workers
and profits… labor costs should thus be kept to a
minimum: employees should be paid the lowest possible
market wage
o How do companies monetize our data? (Make each person
identifiable, know about you and your networks, sell you as a
commodity)
Ensure the data belongs to a real, authentic, identifiable person
Facebook “you are not to provide any false personal
information on Facebook. Facebook users provide their
real names and information.”
Know more about you and your networks
Social graph: a representation of your social network,
depicting both human and non-human objects (like a family
tree, but for social connections); see the sketch after this list
Sell you as a commodity to an advertiser
“the long click” – to know your trail from ad to purchase,
and to use that information to target someone else like
you
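A minimal sketch of a social graph as a data structure: nodes for people and non-human objects (pages, places, employers), edges for typed relationships. All names are invented for illustration; real social graphs are vastly larger and feed the kind of targeting described above:

```python
# Adjacency list of (relationship, target) edges, mixing human and
# non-human nodes.
social_graph = {
    "alice": [("friend", "bob"), ("likes", "CoffeeHouse"), ("works_at", "AcmeCorp")],
    "bob":   [("friend", "alice"), ("likes", "CoffeeHouse")],
}

def shared_interests(a: str, b: str) -> set:
    # Non-friend edges are the raw material for targeting
    # "someone else like you".
    def interests(user):
        return {target for rel, target in social_graph.get(user, []) if rel != "friend"}
    return interests(a) & interests(b)

print(shared_interests("alice", "bob"))  # {'CoffeeHouse'}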
Opportunity vs. Exploitation (REPUTATION ECONOMY)
The reputation economy and self-promotion
o Eg 1: viewers prefer YouTubers who post more often, making the
algorithm favor them
o Eg 2: the process of modifying an existing game (to enhance it)
Done for a chance to get into the industry + to demonstrate skills
and competencies (OPPORTUNITY), but the reality is that only
top mods get noticed
o allows you to build a reputation that you can later capitalize on
o Reputation Economy – a cultural and economic system that
has come to prioritize online reputation as a form of capital,
giving rise to efforts of self-promotion
Built into the system as a metric
o Followers, reviews, posts, etc…
o Prominently displayed (usually next to a display name)
o Represented quantitatively (rating)
o Integrated into the mechanisms of the system (e.g. updated every time
someone follows you)
Used as indicators of reputation
o Allow for fast, heuristic comparisons of reputational capital
o Are computed through algorithms to influence visibility (e.g. through
ranks and the arrangement of listings on a search) to suggest
competence, recommendation, and relevance.
o Which in the process encourages one to see visibility as a sign of
reputation and competence, ie to believe that visibility = credibility
o Encourages people to self-manage their metrics so as to be visible
and appear competent (see the sketch below)
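A minimal sketch of how quantified reputation metrics can feed a visibility ranking, so that posting frequency and follower counts decide who gets seen. The weights and profiles are invented for illustration; real ranking algorithms are black-boxed:

```python
creators = [
    {"name": "small_creator", "followers": 400,    "avg_rating": 4.9, "posts_per_week": 1},
    {"name": "big_creator",   "followers": 90_000, "avg_rating": 4.1, "posts_per_week": 6},
]

def visibility_score(c: dict) -> float:
    # Invented weights: follower count and posting frequency dominate;
    # quality (rating) barely nudges the score.
    return c["followers"] * 0.001 + c["posts_per_week"] * 10 + c["avg_rating"] * 2

for c in sorted(creators, key=visibility_score, reverse=True):
    print(c["name"], round(visibility_score(c), 1))
# big_creator: 158.2 vs. small_creator: 20.2 - frequent free posting is
# rewarded with visibility, which is then read as competence.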
Self-promotion represents the practice of managing metrics and visibility,
characteristic of online reputational systems
o to build an online reputation, people will need to promote themselves
by regularly producing high quality content, often for free.
o And "visibility" – "likes," "shares," "shout-outs" – is understood to be
the "wage" for the contribution of free content. This is thought to boost
reputation and provide future economic opportunities.
o According to this view, the value of free promotion on a wide platform
outweighs any benefits to be gotten from the surety of a professional
pay scale
Does online reputation = economic gain?
o Yes but only for some
o relying on the online reputation as a wage leads to at least three
problems
It leads to a winner-take-all market (a select few benefit
immensely from this, but most do not)
Payoff is uncertain and hours put in are substantial
As more content is produced for free, people have also come to
expect that such content ought to be offered for free, making it
harder for people to charge for content - this creates a culture of
free content
Exploitation is masqueraded as opportunity
Note: the problem is not the tech; it is a system of production that reinforces the
culture of exploitation
Production of jobs vs. production of bad jobs
Automation and the AI curtain
o no jobs are safe (from automation), but some are safer than others
o The idea that creative work is beyond the threat of AI – no longer true
eg: Stability AI, DeviantArt and Midjourney were sued over copyright
infringement
High-skill, high-education work is also threatened by AI, eg: ChatGPT
o Jobs that AI generates
Jobs produced through algorithms tend to be invisible,
piecemeal, temporary, lowly paid, and precarious
Poorly paid, usually nonwhite workers
Eg: crowdsourcing is a good way to break down a manual, time-
consuming project into smaller, more manageable tasks to be
completed by distributed workers over the Internet (eg:
ByteDance); see the sketch below
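A minimal sketch of the crowdsourcing decomposition described above: one large manual job split into small batches that can be farmed out to distributed workers (the task contents and batch size are invented for illustration):

```python
def split_into_tasks(items: list, batch_size: int) -> list:
    # Break one big job into small, independently assignable micro-tasks.
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

images_to_label = [f"img_{n}.jpg" for n in range(10)]  # hypothetical dataset
for task in split_into_tasks(images_to_label, 3):
    print("assign to one worker:", task)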
o AI Curtain
A metaphor which describes how such jobs tend to be invisible
and regarded as done by AI when they are actually done by humans
Eg: Content moderation
Meta: 15,000 content moderators around the world
(2021)
Moderators can be in-house, but they are often
contracted from data companies
Traumatic
This kind of dirty work is so invisible that the general public
thinks it is done by AI
Content moderation – a booming position (amidst layoffs,
employment freezes, and poor economy)
Invisible labor = invisible consequence
Marx
Main point: it is not just that AI introduces a new degree of precarity to labour;
the existential problem of AI is no longer just philosophical – it poses pressing
questions about how people will relate to the world