
7/14/2021 Facebook needs 30,000 of its own content moderators, says a new report | MIT Technology Review


Facebook needs 30,000 of its own content moderators, says a new report
By Charlotte Jee | June 8, 2020

Photo: Associated Press

Imagine if Facebook stopped moderating its site right now. Anyone could post anything they
wanted. Experience suggests it would quite quickly become a hellish environment overrun
with spam, bullying, crime, terrorist beheadings, neo-Nazi texts, and images of child sexual
abuse. In that scenario, vast swaths of its user base would probably leave, followed by the
lucrative advertisers.

But if moderation is so important, it isn’t treated as such. The overwhelming majority of the
15,000 people who spend all day deciding what can and can’t be on Facebook don’t even work
for Facebook. The whole function of content moderation is farmed out to third-party vendors,
who employ temporary workers on precarious contracts at over 20 sites worldwide. They have
to review hundreds of posts a day, many of which are deeply traumatizing. Errors are rife,
despite the company’s adoption of AI tools to triage posts according to which require
attention.

Facebook has itself admitted to a 10% error rate, whether that’s incorrectly flagging posts to
be taken down that should be kept up or vice versa. Given that reviewers have to wade through
three million posts per day, that equates to 300,000 mistakes daily. Some errors can have
deadly effects. For example, members of Myanmar’s military used Facebook to incite genocide
against the mostly Muslim Rohingya minority in 2016 and 2017. The company later admitted it
failed to enforce its own policies banning hate speech and the incitement of violence.
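The back-of-the-envelope arithmetic above can be checked directly. This sketch simply multiplies the two figures quoted in the article (the admitted 10% error rate and the three million posts reviewed per day); the variable names are illustrative, not anything from Facebook's own reporting:

```python
# Figures quoted in the article.
error_rate = 0.10                  # Facebook's admitted moderation error rate
posts_reviewed_per_day = 3_000_000 # posts reviewers wade through daily

mistakes_per_day = round(error_rate * posts_reviewed_per_day)
print(mistakes_per_day)  # → 300000
```

This matches the article's figure of 300,000 mistakes daily.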

If we want to improve how moderation is carried out, Facebook needs to bring content
moderators in-house, make them full employees, and double their numbers, argues a new
report from New York University’s Stern Center for Business and Human Rights.

“Content moderation is not like other outsourced functions, like cooking or cleaning,” says
report author Paul M. Barrett, deputy director of the center. “It is a central function of the
business of social media, and that makes it somewhat strange that it’s treated as if it’s
peripheral or someone else’s problem.”

Why is content moderation treated this way by Facebook’s leaders? It comes at least partly
down to cost, Barrett says. His recommendations would be very costly for the company to
enact—most likely in the tens of millions of dollars (though to put this into perspective, it
makes billions of dollars of profit every year). But there’s a second, more complex, reason. “The
activity of content moderation just doesn’t fit into Silicon Valley’s self-image. Certain types of
activities are very highly valued and glamorized—product innovation, clever marketing,
engineering … the nitty-gritty world of content moderation doesn’t fit into that,” he says.

He thinks it’s time for Facebook to treat moderation as a central part of its business. He says
that elevating its status in this way would help avoid the sorts of catastrophic errors made in
Myanmar, increase accountability, and better protect employees from harm to their mental
health.

It seems an unavoidable reality that content moderation will always involve being exposed to
some horrific material, even if the work is brought in-house. However, there is so much more
the company could do to make it easier: screening moderators better to make sure they are
truly aware of the risks of the job, for example, and ensuring they have first-rate care and
counseling available. Barrett thinks that content moderation could be something all Facebook
employees are required to do for at least a year as a sort of “tour of duty” to help them
understand the impact of their decisions.

The report makes eight recommendations for Facebook:

Stop outsourcing content moderation and raise moderators’ station in the workplace.
Double the number of moderators to improve the quality of content review.
Hire someone to oversee content and fact-checking who reports directly to the CEO or COO.
Further expand moderation in at-risk countries in Asia, Africa, and elsewhere.
Provide all moderators with top-quality, on-site medical care, including access to psychiatrists.
Sponsor research into the health risks of content moderation, in particular PTSD.
Explore narrowly tailored government regulation of harmful content.
Significantly expand fact-checking to debunk false information.

The proposals are ambitious, to say the least. When contacted for comment, Facebook would
not discuss whether it would consider enacting them. However, a spokesperson said its current
approach means “we can quickly adjust the focus of our workforce as needed,” adding that “it
gives us the ability to make sure we have the right language expertise—and can quickly hire in
different time zones—as new needs arise or when a situation around the world warrants it.”

But Barrett thinks a recent experiment conducted in response to the coronavirus crisis shows
change is possible. Facebook announced that because many of its content moderators were
unable to go into company offices, it would shift responsibility to in-house employees for
checking certain sensitive categories of content.

“I find it very telling that in a moment of crisis, Zuckerberg relied on the people he trusts: his
full-time employees,” he says. “Maybe that could be seen as the basis for a conversation within
Facebook about adjusting the way it views content moderation.”
