Facebook needs 30,000 of its own content moderators, says a new report
MIT Technology Review, June 8, 2020
Imagine if Facebook stopped moderating its site right now. Anyone could post anything they
wanted. Experience seems to suggest that it would quite quickly become a hellish environment
overrun with spam, bullying, crime, terrorist beheadings, neo-Nazi texts, and images of child
sexual abuse. In that scenario, vast swaths of its user base would probably leave, followed by
the lucrative advertisers.
But although moderation is so important, it isn’t treated as such. The overwhelming majority of the 15,000 people who spend all day deciding what can and can’t be on Facebook don’t even work for Facebook. The whole function of content moderation is farmed out to third-party vendors, who employ temporary workers on precarious contracts at over 20 sites worldwide.

They have to review hundreds of posts a day, many of which are deeply traumatizing. Errors are rife, despite the company’s adoption of AI tools to triage posts according to which require attention. Facebook has itself admitted to a 10% error rate, whether that’s incorrectly flagging posts to be taken down that should be kept up, or vice versa. Given that reviewers have to wade through three million posts per day, that equates to 300,000 mistakes daily. Some errors can have
deadly effects. For example, members of Myanmar’s military used Facebook to incite genocide
against the mostly Muslim Rohingya minority in 2016 and 2017. The company later admitted it
failed to enforce its own policies banning hate speech and the incitement of violence.
If we want to improve how moderation is carried out, Facebook needs to bring content
moderators in-house, make them full employees, and double their numbers, argues a new
report from New York University’s Stern Center for Business and Human Rights.
“Content moderation is not like other outsourced functions, like cooking or cleaning,” says
report author Paul M. Barrett, deputy director of the center. “It is a central function of the
business of social media, and that makes it somewhat strange that it’s treated as if it’s
peripheral or someone else’s problem.”
Why is content moderation treated this way by Facebook’s leaders? It comes at least partly
down to cost, Barrett says. His recommendations would be very costly for the company to
enact—most likely in the tens of millions of dollars (though to put this into perspective, it
makes billions of dollars of profit every year). But there’s a second, more complex, reason. “The
activity of content moderation just doesn’t fit into Silicon Valley’s self-image. Certain types of
activities are very highly valued and glamorized—product innovation, clever marketing,
engineering … the nitty-gritty world of content moderation doesn’t fit into that,” he says.
He thinks it’s time for Facebook to treat moderation as a central part of its business. He says
that elevating its status in this way would help avoid the sorts of catastrophic errors made in
Myanmar, increase accountability, and better protect employees from harm to their mental
health.
It seems an unavoidable reality that content moderation will always involve being exposed to
some horrific material, even if the work is brought in-house. However, there is so much more
the company could do to make it easier: screening moderators better to make sure they are
truly aware of the risks of the job, for example, and ensuring they have first-rate care and
counseling available. Barrett thinks that content moderation could be something all Facebook
employees are required to do for at least a year as a sort of “tour of duty” to help them
understand the impact of their decisions.
The report makes eight recommendations to Facebook:

- Stop outsourcing content moderation and raise moderators’ station in the workplace.
- Double the number of moderators to improve the quality of content review.
- Hire someone to oversee content and fact-checking who reports directly to the CEO or COO.
- Further expand moderation in at-risk countries in Asia, Africa, and elsewhere.
- Provide all moderators with top-quality, on-site medical care, including access to psychiatrists.
- Sponsor research into the health risks of content moderation, in particular PTSD.
- Explore narrowly tailored government regulation of harmful content.
- Significantly expand fact-checking to debunk false information.
The proposals are ambitious, to say the least. When contacted for comment, Facebook would
not discuss whether it would consider enacting them. However, a spokesperson said its current
approach means “we can quickly adjust the focus of our workforce as needed,” adding that “it
gives us the ability to make sure we have the right language expertise—and can quickly hire in
different time zones—as new needs arise or when a situation around the world warrants it.”
But Barrett thinks a recent experiment conducted in response to the coronavirus crisis shows
change is possible. Facebook announced that because many of its content moderators were
unable to go into company offices, it would shift responsibility to in-house employees for
checking certain sensitive categories of content.
“I find it very telling that in a moment of crisis, Zuckerberg relied on the people he trusts: his
full-time employees,” he says. “Maybe that could be seen as the basis for a conversation within
Facebook about adjusting the way it views content moderation.”