Nearly two billion people use Facebook every month, about 1.2 billion
of them daily. The company, which Zuckerberg co-founded in his
Harvard dorm room 13 years ago, has become the largest and most
influential entity in the news business, commanding an audience
greater than that of any American or European television news
network, any newspaper or magazine in the Western world and any
online news outlet. It is also the most powerful mobilizing force in
politics, and it is fast replacing television as the most consequential
entertainment medium. Just five years after its initial public offering,
Facebook is one of the 10 highest market-capitalized public
companies in the world.
It was hard to tell how seriously Zuckerberg took the criticisms of his
service and its increasingly paradoxical role in the world. He had
spent much of his life building a magnificent machine to bring people
together. By the most literal measures, he'd succeeded spectacularly,
but what had that connection wrought? Across the globe, Facebook
now seems to benefit actors who want to undermine the global vision
at its foundation. Supporters of Trump and the European right-wing
nationalists who aim to turn their nations inward and dissolve
alliances, trolls sowing cross-border paranoia, even ISIS with its
skillful social-media recruiting and propagandizing: all of them
have sought in their own ways to split the Zuckerbergian world apart.
And they are using his own machine to do it.
If it's an exaggeration to say that News Feed has become the most
influential source of information in the history of civilization, it is
only slightly so. Facebook created News Feed in 2006 to solve a
problem: In the social-media age, people suddenly had too many
friends to keep up with. At the time, Facebook was just a collection of
profiles, lacking any kind of central organization. To figure out what
any of your connections were up to, you had to visit each of their
profiles to see if anything had changed. News Feed fixed that. Every
time you open Facebook, it hunts through the network, collecting
every post from every connection, information that, for most
Facebook users, would be too overwhelming to process on their own.
Then it weighs the merits of each post before presenting you with a
feed sorted in order of importance: a hyperpersonalized front page
designed just for you.
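The pipeline described above (collect every candidate post, weigh each one, sort by importance) can be sketched in a few lines. This is purely illustrative: the signals, weights and names below are invented for the example and are not Facebook's actual model, which reportedly weighs thousands of signals.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float   # how long ago it was posted
    affinity: float    # how often this viewer interacts with the author, 0-1
    engagement: int    # Likes and comments the post has drawn so far

def score(post: Post) -> float:
    """Toy relevance score: favor close connections and fresh, engaging posts."""
    recency = 1.0 / (1.0 + post.age_hours)  # decays as the post ages
    return 2.0 * post.affinity + recency + 0.1 * post.engagement

def build_feed(posts: list[Post]) -> list[Post]:
    """Collect every candidate post, weigh each one, sort by importance."""
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("acquaintance", age_hours=1, affinity=0.1, engagement=3),
    Post("close friend", age_hours=5, affinity=0.9, engagement=12),
]
feed = build_feed(posts)
print([p.author for p in feed])  # the close friend's older post still ranks first
```

The point is only the shape of the pipeline, gather, score, sort, which is what turns a pile of profiles into a "hyperpersonalized front page."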
Facebook says its own researchers have been studying the filter
bubble since 2010. In 2015, they published an in-house study, which
was criticized by independent researchers, concluding that
Facebook's effect on the diversity of people's information diets was
minimal. News Feed's personalization algorithm did filter out some
opposing views in your feed, the study claimed, but the bigger effect
was users' own choices. When News Feed did show people views
contrary to their own, they tended not to click on the stories. For
Zuckerberg, the finding let Facebook off the hook. "It's a good-sounding
theory, and I can get why people repeat it, but it's not true,"
he said on a call with analysts last summer.
At the time of our second interview, the manifesto was still only a
draft, and I was surprised by how unsure Zuckerberg seemed about it
in person. He had almost as many questions for us about whether
we understood what he was trying to say, how we thought it would
land in the media as we had for him. When I suggested that it
might be perceived as an attack on Trump, he looked dismayed. He
noted several times that he had been noodling over these ideas since
long before November. A few weeks earlier, there was media
speculation, fueled by a postelection tour of America by Zuckerberg
and his wife, that he was laying the groundwork to run against Trump
in 2020, and in this meeting he took pains to shoot down the rumors.
When I asked if he had chatted with Obama about the former
president's critique of Facebook, Zuckerberg paused for several
seconds, nearly to the point of awkwardness, before answering that
he had.
The people who work on News Feed aren't making decisions that turn
on fuzzy human ideas like ethics, judgment, intuition or seniority.
They are concerned only with quantifiable outcomes about people's
actions on the site. That data, at Facebook, is the only real truth. And
it is a particular kind of truth: The News Feed team's ultimate
mission is to figure out what users want (what they find
"meaningful," to use Cox and Zuckerberg's preferred term) and to
give them more of that.
This ideal runs so deep that the people who make News Feed often
have to put aside their own notions of what's best. "One of the things
we've all learned over the years is that our intuition can be wrong a
fair amount of the time," John Hegeman, the vice president of
product management and a News Feed team member, told me.
"There are things you don't expect will happen. And we learn a lot
from that process: Why didn't that happen, and what might that
mean?" But it is precisely this ideal that conflicts with attempts to
wrangle the feed in the way press critics have called for. The whole
purpose of editorial guidelines and ethics is often to suppress
individual instincts in favor of some larger social goal. Facebook finds
it very hard to suppress anything that its users' actions say they want.
In some cases, it has been easier for the company to seek out
evidence that, in fact, users don't want these things at all.
If a human editor ran News Feed, she would look at the clickbait
scourge and make simple, intuitive fixes: Turn down the Upworthy
knob. But Facebook approaches the feed as an engineering project
rather than an editorial one. When it makes alterations in the code
that powers News Feed, it's often only because it has found some
clear signal in its data that users are demanding the change. In this
sense, clickbait was a riddle. In surveys, people kept telling Facebook
that they hated teasing headlines. But if that was true, why were they
clicking on them? Was there something Facebook's algorithm was
missing, some signal that would show that despite the clicks, clickbait
was really sickening users?
To answer these questions, Cox and his team hired survey panels of
more than a thousand paid "professional raters" around the world
who answer questions about how well News Feed is working. Starting
in 2013, Facebook began adding first dozens and then hundreds and
then thousands of data points that were meant to teach the artificial-
intelligence system that runs News Feed how people were reacting to
their feeds. Facebook noticed that people would sometimes click open
a clickbaity story but spend very little time on it. In other cases, lots
of people would click on a story but few would share or Like it.
Headlines on stories that people seemed to reject often contained a
set of signature phrases ("you'll never believe," "this one trick," etc.)
or they came from a set of repeat-offender publishers.
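The signals in this passage, short dwell time after a click, clicks that never convert to shares or Likes, signature headline phrases, and repeat-offender publishers, lend themselves to a simple scoring rule. The sketch below is hypothetical: the thresholds, weights and publisher list are invented for illustration and are not Facebook's.

```python
# Telltale phrases and publishers; both lists are made up for this example.
SIGNATURE_PHRASES = ("you'll never believe", "this one trick")
REPEAT_OFFENDERS = {"example-clickbait-site.com"}

def clickbait_score(headline: str, publisher: str,
                    avg_dwell_seconds: float,
                    clicks: int, shares_and_likes: int) -> float:
    """Combine several weak signals into one clickbait score (higher = worse)."""
    score = 0.0
    if any(p in headline.lower() for p in SIGNATURE_PHRASES):
        score += 1.0   # signature teasing phrase in the headline
    if publisher in REPEAT_OFFENDERS:
        score += 1.0   # known repeat offender
    if avg_dwell_seconds < 10:
        score += 1.0   # clicked open, but barely read
    if clicks > 0 and shares_and_likes / clicks < 0.01:
        score += 1.0   # clicks that never turn into shares or Likes
    return score

print(clickbait_score("You'll Never Believe What Happened Next",
                      "example-clickbait-site.com",
                      avg_dwell_seconds=4, clicks=1000, shares_and_likes=2))
```

No single signal is decisive on its own; the idea, as the passage suggests, is that clicks alone overstate what users actually value, and the surrounding behavior reveals the rejection.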
The more such signals Facebook incorporated, the more clickbait
began to drop out of the feed. Since its 2013 peak,
Upworthy's traffic has declined; it now averages 17.5 million visitors a
month. The site has since disavowed clickbait. "We sort of unleashed
a monster," Peter Koechley, Upworthy's co-founder, told a
conference in 2015. "Sorry for that."
Cox suggested that it was not exactly correct to say that Facebook,
as a company, decided to fight clickbait. What actually happened was
that Facebook found better ways to listen to users, who were
themselves rejecting clickbait. "That comes out of good-quality panels
and measurement systems, rather than an individual decision saying,
'Hey, I really want us to care about clickbait,'" he says.
"At some point, if they really want to address this, they have to say,
'This is good information' and 'This is bad information,'" says Emily
Bell, the director of the Tow Center for Digital Journalism at
Columbia Journalism School. "They have to say, 'These are the kinds
of information sources that we want to privilege, and these others are
not going to be banned from the platform, but they are not going to
thrive.' In other words, they have to create a hierarchy, and they're
going to have to decide how they're going to transfer wealth into the
publishing market."
There aren't many technical reasons Facebook could not implement
such plans. The hurdles are institutional and philosophical, and
ultimately financial too. Late last year, Facebook outlined a modest
effort to curb misinformation. News Feed would now carry warning
labels: If a friend shares a viral story that has been flagged and shot
down by one of Facebook's fact-checking partners (including Snopes
and PolitiFact), you'll be cautioned that the piece has been
"disputed." But even that slight change has been met with fury on the
right, with Breitbart and The Daily Caller fuming that Facebook had
teamed up with "liberal hacks" motivated by partisanship. If Facebook
were to take more significant action, like hiring human editors,
creating a reputational system or paying journalists, the company
would instantly become something it has long resisted: a media
company rather than a neutral tech platform.
In many ways, the worry over how Facebook changes the news is
really a manifestation of a grander problem with News Feed, which is
simply dominance itself. News Feed's aggressive personalization
wouldn't be much of an issue if it weren't crowding out every other
source. "To some degree I feel like the Pottery Barn Rule applies,"
says Pariser, the Upworthy chief executive. "They play a critical role
in our information circulatory system, and so, lucky them, all of
the problems therein are significantly on their shoulders."
But as he took the stage in San Jose, Zuckerberg was ebullient. He
started with a few dad jokes and threatened to read his long
manifesto on stage. For a brief moment, there was a shift in tone:
Statesman Zuck. "In all seriousness, this is an important time to work
on building community," he said. He offered Facebook's condolences
to the victim in Cleveland; the incident, he said, reminded Facebook
that "we have a lot more to do."
A version of this article appears in print on April 30, 2017, on Page MM39 of
the Sunday Magazine with the headline: Social Insecurity.