The Atlantic

Facebook Doesn't Understand Itself

The company’s content moderation guide suggests it hasn’t come to grips with its unique role in the world.

Facebook’s 2 billion users post a steady stream of baby pictures, opinions about romantic comedies, reactions to the news—and disturbing depictions of violence, abuse, and self-harm. Over the last decade, the company has struggled to come to terms with moderating that last category. How do they parse a joke from a threat, art from pornography, a cry for help from a serious suicide attempt? And even if they can correctly categorize disturbing posts with thousands of human contractors sifting through user-flagged content, what should they do about it?

This weekend, The Guardian began publishing stories based on 100 documents leaked to them from the training process that these content moderators go through. They’re calling it The Facebook Files. Facebook neither confirmed nor denied the
