POLICY FORUM
SOCIAL SCIENCE

The science of fake news

Addressing fake news requires a multidisciplinary effort

By David M. J. Lazer, Matthew A. Baum, Yochai Benkler, Adam J. Berinsky, Kelly M. Greenhill, Filippo Menczer, Miriam J. Metzger, Brendan Nyhan, Gordon Pennycook, David Rothschild, Michael Schudson, Steven A. Sloman, Cass R. Sunstein, Emily A. Thorson, Duncan J. Watts, Jonathan L. Zittrain

The list of author affiliations is provided in the supplementary materials. Email: d.lazer@northeastern.edu

The rise of fake news highlights the erosion of long-standing institutional bulwarks against misinformation in the internet age. Concern over the problem is global. However, much remains unknown regarding the vulnerabilities of individuals, institutions, and society to manipulations by malicious actors. A new system of safeguards is needed. Below, we discuss extant social and computer science research regarding belief in fake news and the mechanisms by which it spreads. Fake news has a long history, but we focus on unanswered scientific questions raised by the proliferation of its most recent, politically oriented incarnation. Beyond selected references in the text, suggested further reading can be found in the supplementary materials.

WHAT IS FAKE NEWS?
We define "fake news" to be fabricated information that mimics news media content in form but not in organizational process or intent. Fake-news outlets, in turn, lack the news media's editorial norms and processes for ensuring the accuracy and credibility of information. Fake news overlaps with other information disorders, such as misinformation (false or misleading information) and disinformation (false information that is purposely spread to deceive people).

Fake news has primarily drawn recent attention in a political context but it also has been documented in information promulgated about topics such as vaccination, nutrition, and stock values. It is particularly pernicious in that it is parasitic on standard news outlets, simultaneously benefiting from and undermining their credibility.

Some—notably First Draft and Facebook—favor the term "false news" because of the use of fake news as a political weapon (1). We have retained it because of its value as a scientific construct, and because its political salience draws attention to an important subject.

THE HISTORICAL SETTING
Journalistic norms of objectivity and balance arose as a backlash among journalists against the widespread use of propaganda in World War I (particularly their own role in propagating it) and the rise of corporate public relations in the 1920s. Local and national oligopolies created by the dominant 20th century technologies of information distribution (print and broadcast) sustained these norms. The internet has lowered the cost of entry to new competitors—many of which have rejected those norms—and undermined the business models of traditional news sources that had enjoyed high levels of public trust and credibility.

General trust in the mass media collapsed to historic lows in 2016, especially on the political right, with 51% of Democrats and 14% of Republicans expressing "a fair amount" or "a great deal" of trust in mass media as a news source (2).

The United States has undergone a parallel geo- and sociopolitical evolution. Geographic polarization of partisan preferences has dramatically increased over the past 40 years, reducing opportunities for cross-cutting political interaction. Homogeneous social networks, in turn, reduce tolerance for alternative views, amplify attitudinal polarization, boost the likelihood of accepting ideologically compatible news, and increase closure to new information. Dislike of the "other side" (affective polarization) has also risen. These trends have created a context in which fake news can attract a mass audience.

PREVALENCE AND IMPACT
How common is fake news, and what is its impact on individuals? There are surprisingly few scientific answers to these basic questions.

In evaluating the prevalence of fake news, we advocate focusing on the original sources—the publishers—rather than individual stories, because we view the defining element of fake news to be the intent and processes of the publisher. A focus on publishers also allows us to avoid the morass of trying to evaluate the accuracy of every single news story.

One study evaluating the dissemination of prominent fake news stories estimated that the average American encountered between one and three stories from known publishers of fake news during the month before the 2016 election (3). This likely is a conservative estimate because the study tracked only 156 fake news stories. Another study reported that false information on Twitter is typically retweeted by many more people, and far more rapidly, than true information, especially when the topic is politics (4). Facebook has estimated that manipulations by malicious actors accounted for less than one-tenth of 1% of civic content shared on the platform (5), although it has not presented details of its analysis.

By liking, sharing, and searching for information, social bots (automated accounts impersonating humans) can magnify the spread of fake news by orders of magnitude. By one recent estimate—which classified accounts based on observable features such as sharing behavior, number of ties, and linguistic features—between 9 and 15% of active Twitter accounts are bots (6). Facebook estimated that as many as 60 million bots (7) may be infesting its platform. They were responsible for a substantial portion of political content posted during the 2016 U.S. campaign, and some of the same bots were later used to attempt to influence the 2017 French election (8). Bots are also deployed to manipulate algorithms used to predict potential engagement with content by a wider population. Indeed, a Facebook white paper reports widespread efforts to carry out this sort of manipulation during the 2016 U.S. election (5).

However, in the absence of methods to derive representative samples of bots and humans on a given platform, any point estimates of bot prevalence must be interpreted cautiously. Bot detection will always be a cat-and-mouse game in which a large, but unknown, number of humanlike bots may go undetected. Any success at detection, in turn, will inspire future countermeasures by bot producers. Identification of bots will therefore be a major ongoing research challenge.
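The feature-based detection approach mentioned above can be made concrete with a small sketch. The snippet below is a minimal illustration, not the classifier of reference 6: it assumes a hypothetical labeled sample of accounts described by three numeric features of the kind named in the text (sharing rate, number of ties, a simple linguistic score) and trains an off-the-shelf supervised model on them.

```python
# Minimal sketch of feature-based bot classification (illustrative only; not the
# method of reference 6). Assumes a hypothetical labeled sample of accounts.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Each row: [shares_per_day, number_of_ties, linguistic_score]; label 1 = bot, 0 = human.
X = [
    [310.0,  42.0, 0.12],   # hyperactive account, few ties, repetitive language
    [280.0,  15.0, 0.08],
    [  4.0, 350.0, 0.71],   # typical human-like account
    [  2.5, 120.0, 0.65],
    [150.0,  30.0, 0.20],
    [  1.0, 480.0, 0.80],
    [220.0,  10.0, 0.05],
    [  3.0, 260.0, 0.75],
]
y = [1, 1, 0, 0, 1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# A random forest is one common choice for tabular account features.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test), zero_division=0))
```

In practice, the cat-and-mouse dynamic described above means that any fixed feature set degrades as bot producers adapt, which is one reason representative ground truth is so difficult to obtain.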
We do know that, as with legitimate news, fake news stories have gone viral on social media. However, knowing how many individuals encountered or shared a piece of fake news is not the same as knowing how many people read or were affected by it. Evaluations of the medium- to long-run impact on political behavior of exposure to fake news (for example, whether and how to vote) are essentially nonexistent in the literature. The impact might be small—evidence suggests that efforts by political campaigns to persuade individuals may have limited effects (9). However, mediation of much fake news via social media might accentuate its effect because of the implicit endorsement that comes with sharing. Beyond electoral impacts, what we know about the effects of media more generally suggests many potential pathways of influence, from increasing cynicism and apathy to encouraging extremism. There exists little evaluation of the impacts of fake news in these regards.

POTENTIAL INTERVENTIONS
What interventions might be effective at stemming the flow and influence of fake news? We identify two categories of interventions: (i) those aimed at empowering individuals to evaluate the fake news they encounter, and (ii) structural changes aimed at preventing exposure of individuals to fake news in the first instance.

Empowering individuals
There are many forms of fact checking, from websites that evaluate factual claims of news reports, such as PolitiFact and Snopes, to evaluations of news reports by credible news media, such as the Washington Post and the Wall Street Journal, to contextual information regarding content inserted by intermediaries, such as those used by Facebook.

Despite the apparent elegance of fact checking, the science supporting its efficacy is, at best, mixed. This may reflect broader tendencies in collective cognition, as well as structural changes in our society. Individuals tend not to question the credibility of information unless it violates their preconceptions or they are incentivized to do so. Otherwise, they may accept information uncritically. People also tend to align their beliefs with the values of their community.

Research further demonstrates that people prefer information that confirms their preexisting attitudes (selective exposure), view information consistent with their preexisting beliefs as more persuasive than dissonant information (confirmation bias), and are inclined to accept information that pleases them (desirability bias). Prior partisan and ideological beliefs might prevent acceptance of fact checking of a given fake news story.

Fact checking might even be counterproductive under certain circumstances. Research on fluency—the ease of information recall—and familiarity bias in politics shows that people tend to remember information, or how they feel about it, while forgetting the context within which they encountered it. Moreover, they are more likely to accept familiar information as true (10). There is thus a risk that repeating false information, even in a fact-checking context, may increase an individual's likelihood of accepting it as true. The evidence on the effectiveness of claim repetition in fact checking is mixed (11).

Although experimental and survey research have confirmed that the perception of truth increases when misinformation is repeated, this may not occur if the misinformation is paired with a valid retraction. Some research suggests that repetition of the misinformation before its correction may even be beneficial. Further research is needed to reconcile these contradictions and determine the conditions under which fact-checking interventions are most effective.

Another, longer-run, approach seeks to improve individual evaluation of the quality of information sources through education. There has been a proliferation of efforts to inject training of critical-information skills into primary and secondary schools (12). However, it is uncertain whether such efforts improve assessments of information credibility or if any such effects will persist over time. An emphasis on fake news might also have the unintended consequence of reducing the perceived credibility of real-news outlets. There is a great need for rigorous program evaluation of different educational interventions.

Platform-based detection and intervention: Algorithms and bots
Internet platforms have become the most important enablers and primary conduits of fake news. It is inexpensive to create a website that has the trappings of a professional news organization. It has also been easy to monetize content through online ads and social media dissemination. The internet not only provides a medium for publishing fake news but offers tools to actively promote dissemination.

About 47% of Americans overall report getting news from social media often or sometimes, with Facebook as, by far, the dominant source (13). Social media are key conduits for fake news sites (3). Indeed, Russia successfully manipulated all of the major platforms during the 2016 U.S. election, according to recent congressional testimony (7).

How might the internet and social media platforms help reduce the spread and impact of fake news? Google, Facebook, and Twitter are often mediators not only of our relationship with the news media but also with our friends and relatives. Generally, their business model relies on monetizing attention through advertising. They use complex statistical models to predict and maximize engagement with content (14). It should be possible to adjust those models to increase emphasis on quality information.

The platforms could provide consumers with signals of source quality that could be incorporated into the algorithmic rankings of content. They could minimize the personalization of political information relative to other types of content (reducing the creation of "echo chambers"). Functions that emphasize currently trending content could seek to exclude bot activity from measures of what is trending. More generally, the platforms could curb the automated spread of news content by bots and cyborgs (users who automatically share news from a set of sources, with or without reading them), although for the foreseeable future, bot producers will likely be able to design effective countermeasures.
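To make the ranking idea above concrete, here is a minimal sketch, under assumed inputs, of how a feed score that currently optimizes predicted engagement could be blended with a source-quality signal and discounted when recent shares come largely from suspected bot accounts. All names, weights, and inputs are hypothetical; the article does not describe any platform's actual model.

```python
# Illustrative sketch only: a hypothetical re-ranking rule, not any platform's
# actual algorithm. The three per-item inputs are assumed to be available.
from dataclasses import dataclass

@dataclass
class Item:
    url: str
    predicted_engagement: float  # output of an engagement model, scaled 0-1
    source_quality: float        # external signal of publisher quality, scaled 0-1
    bot_share_fraction: float    # fraction of recent shares attributed to suspected bots

def ranking_score(item: Item, quality_weight: float = 0.5) -> float:
    """Blend engagement with source quality, then discount bot-driven popularity."""
    blended = (1 - quality_weight) * item.predicted_engagement \
              + quality_weight * item.source_quality
    # Down-weight items whose apparent popularity is largely bot-driven.
    return blended * (1.0 - item.bot_share_fraction)

feed = [
    Item("https://example-fake.news/story", 0.95, 0.10, 0.60),
    Item("https://example-paper.com/report", 0.60, 0.90, 0.05),
]
for item in sorted(feed, key=ranking_score, reverse=True):
    print(f"{ranking_score(item):.2f}  {item.url}")
```

Raising quality_weight trades engagement for quality; where to set it, and how to validate the source-quality signal itself, are exactly the kinds of questions the authors argue require independent evaluation.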
The platforms have attempted each of these steps and others (5, 15). Facebook announced an intent to shift its algorithm to account for "quality" in its content curation process. Twitter announced that it blocked certain accounts linked to Russian misinformation and informed users exposed to those accounts that they may have been duped. However, the platforms have not provided enough detail for evaluation by the research community or subjected their findings to peer review, making them problematic for use by policy-makers or the general public.

We urge the platforms to collaborate with independent academics on evaluating the scope of the fake news issue and the design and effectiveness of interventions. There is little research focused on fake news and no comprehensive data-collection system to provide a dynamic understanding of how pervasive systems of fake news provision are evolving. It is impossible to recreate the Google of 2010. Google itself could not do so even if it had the underlying code, because the patterns emerge from a complex interaction among code, content, and users. However, it is possible to record what the Google of 2018 is doing. More generally, researchers need to conduct a rigorous, ongoing audit of how the major platforms filter information.
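The auditing idea in the preceding paragraph, recording what a platform serves for a fixed set of queries over time, can be illustrated with a small sketch. The fetch_ranked_results argument is a placeholder for whatever collection method a study would actually use (an official API, a panel of instrumented browsers, and so on); only the logging pattern is shown, and all names here are hypothetical.

```python
# Minimal sketch of an ongoing platform audit: repeatedly record the ranked results
# returned for a fixed set of queries so that changes in filtering can be studied later.
# fetch_ranked_results is a hypothetical placeholder, not a real platform API.
import json
from datetime import datetime, timezone
from typing import Callable, List

def audit_snapshot(queries: List[str],
                   fetch_ranked_results: Callable[[str], List[str]],
                   out_path: str) -> None:
    """Append one timestamped snapshot of ranked results per query to a JSON-lines log."""
    timestamp = datetime.now(timezone.utc).isoformat()
    with open(out_path, "a", encoding="utf-8") as log:
        for query in queries:
            record = {
                "timestamp": timestamp,
                "query": query,
                "ranked_urls": fetch_ranked_results(query),
            }
            log.write(json.dumps(record) + "\n")

# Example usage with a stub in place of real data collection.
def fake_fetch(query: str) -> List[str]:
    return [f"https://example.com/{query}/result-{i}" for i in range(3)]

audit_snapshot(["election news", "vaccine safety"], fake_fetch, "audit_log.jsonl")
```

Repeated on a schedule over months, such logs are the raw material for the kind of rigorous, ongoing audit the authors call for.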
There are challenges to scientific collaboration from the perspectives of industry and academia. Yet, there is an ethical and social responsibility, transcending market forces, for the platforms to contribute what data they uniquely can to a science of fake news.

The possible effectiveness of platform-based policies would point to either government regulation of the platforms or self-regulation. Direct government regulation of an area as sensitive as news carries its own risks, constitutional and otherwise. For instance, could regulators maintain (and, as important, be seen as maintaining) impartiality in defining, imposing, and enforcing any requirements? Generally, any direct intervention by government or the platforms that prevents users from seeing content raises concerns about either government or corporate censorship.

An alternative to direct government regulation would be to enable tort lawsuits alleging, for example, defamation by those directly and concretely harmed by the spread of fake news. To the extent that an online platform assisted in the spreading of a manifestly false (but still persuasive) story, there might be avenues for liability consistent with existing constitutional law, which, in turn, would pressure platforms to intervene more regularly. In the U.S. context, however, a provision of the 1996 Communications Decency Act offers near-comprehensive immunity to platforms for false or otherwise actionable statements penned by others. Any change to this legal regime would raise thorny issues about the extent to which platform content (and content-curation decisions) should be subject to second-guessing by people alleging injury. The European "right to be forgotten" in search engines is testing these issues.

Structural interventions generally raise legitimate concerns about respecting private enterprise and human agency. But just as the media companies of the 20th century shaped the information to which individuals were exposed, the far-more-vast internet oligopolies are already shaping human experience on a global scale. The questions before us are how those immense powers are being—and should be—exercised and how to hold these massive companies to account.

A FUTURE AGENDA
Our call is to promote interdisciplinary research to reduce the spread of fake news and to address the underlying pathologies it has revealed. Failures of the U.S. news media in the early 20th century led to the rise of journalistic norms and practices that, although imperfect, generally served us well by striving to provide objective, credible information. We must redesign our information ecosystem in the 21st century. This effort must be global in scope, as many countries, some of which have never developed a robust news ecosystem, face challenges around fake and real news that are more acute than in the United States. More broadly, we must answer a fundamental question: How can we create a news ecosystem and culture that values and promotes truth?

REFERENCES AND NOTES
1. C. Wardle, H. Derakhshan, "Information disorder: Toward an interdisciplinary framework for research and policy making" [Council of Europe policy report DGI(2017)09, Council of Europe, 2017]; https://firstdraftnews.com/wp-content/uploads/2017/11/PREMS-162317-GBR-2018-Report-de%CC%81sinformation-1.pdf?x29719.
2. A. Swift, Americans' trust in mass media sinks to new low (Gallup, 2016); www.gallup.com/poll/195542/americans-trust-mass-media-sinks-new-low.aspx.
3. H. Allcott, M. Gentzkow, J. Econ. Perspect. 31, 211 (2017).
4. S. Vosoughi et al., Science 359, 1146 (2018).
5. J. Weedon et al., Information operations and Facebook (Facebook, 2017); https://fbnewsroomus.files.wordpress.com/2017/04/facebook-and-information-operations-v1.pdf.
6. O. Varol et al., in Proceedings of the 11th AAAI Conference on Web and Social Media (Association for the Advancement of Artificial Intelligence, Montreal, 2017), pp. 280–289.
7. Senate Judiciary Committee, Extremist content and Russian disinformation online: Working with tech to find solutions (Committee on the Judiciary, 2017); www.judiciary.senate.gov/meetings/extremist-content-and-russian-disinformation-online-working-with-tech-to-find-solutions.
8. E. Ferrara, First Monday 22, 2017 (2017).
9. J. L. Kalla, D. E. Broockman, Am. Polit. Sci. Rev. 112, 148 (2018).
10. B. Swire et al., J. Exp. Psychol. Learn. Mem. Cogn. 43, 1948 (2017).
11. U. K. H. Ecker et al., J. Appl. Res. Mem. Cogn. 6, 185 (2017).
12. C. Jones, Bill would help California schools teach about "fake news," media literacy (EdSource, 2017); https://edsource.org/2017/bill-would-help-california-schools-teach-about-fake-news-media-literacy/582363.
13. J. Gottfried, E. Shearer, News use across social media platforms 2017 (Pew Research Center, 7 September 2017); www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/.
14. E. Bakshy et al., Science 348, 1130 (2015).
15. C. Crowell, Our approach to bots & misinformation (Twitter, 14 June 2017); https://blog.twitter.com/official/en_us/topics/company/2017/Our-Approach-Bots-Misinformation.html.

ACKNOWLEDGMENTS
We acknowledge support from the Shorenstein Center at the Harvard Kennedy School and the NULab for Texts, Maps, and Networks at Northeastern University. D.M.J.L. acknowledges support by the Economic and Social Research Council ES/N012283/1. D.M.J.L. and M.A.B. contributed equally to this article. Y.B. is on the advisory board of the Open Science Foundation. C.R.S. has consulted for Facebook. K.M.G. acknowledges support by the National Endowment for the Humanities.

SUPPLEMENTARY MATERIALS
www.sciencemag.org/content/359/6380/1094/suppl/DC1

10.1126/science.aao2998
