The science of fake news
Article in Science · March 2018
DOI: 10.1126/science.aao2998


INSIGHTS
POLICY FORUM

SOCIAL SCIENCE

The science of fake news

Addressing fake news requires a multidisciplinary effort

By David Lazer, Matthew Baum, Yochai Benkler, Adam Berinsky, Kelly Greenhill, Filippo Menczer, Miriam Metzger, Brendan Nyhan, Gordon Pennycook, David Rothschild, Michael Schudson, Steven Sloman, Cass Sunstein, Emily Thorson, Duncan Watts, Jonathan Zittrain

The rise of fake news highlights the erosion of long-standing institutional bulwarks against misinformation in the internet age. Concern over the problem is global. However, little is known regarding fundamental questions about the vulnerabilities of individuals, institutions, and society to manipulations by malicious actors. Below we discuss extant social and computer science research regarding belief in fake news and the mechanisms by which it spreads. We focus on unanswered scientific questions raised by the proliferation of its most recent, politically oriented incarnation. Beyond selected references in the text, suggested further reading can be found in the supplementary materials.

WHAT IS FAKE NEWS?
We define "fake news" to be fabricated information that mimics news media content in form but not in organizational process or intent. Fake news outlets, in turn, lack the news media's editorial norms and processes for ensuring the accuracy and credibility of information. Fake news overlaps with other information disorders, such as misinformation (false or misleading information) and disinformation (false information that is purposely spread to deceive people).

Fake news has primarily drawn recent attention in a political context, but it also has been documented in information promulgated about topics such as vaccination, nutrition, and stock values. It is particularly pernicious in that it is parasitic on standard news outlets, simultaneously benefiting from and undermining their credibility.

Our definition of fake news makes no assumptions about the characteristics of sources or amplification strategies. Some, notably First Draft and Facebook, favor "false news" because of the use of fake news as a political weapon (1). We have retained it because of its value as a scientific construct, and because its political salience draws attention to an important subject.

THE HISTORICAL SETTING
Journalistic norms of objectivity and balance arose as a backlash among journalists against the widespread use of propaganda in World War I (particularly their own role in propagating it) and the rise of corporate public relations in the 1920s. Local and national oligopolies created by the dominant 20th-century technologies of information distribution (print and broadcast) sustained these norms. The internet has reduced many of those constraints on news dissemination. This allows outlets that do not embody these norms to compete online with those that do, more effectively than was possible offline. It has contributed to the abandonment of traditional news sources that had enjoyed high levels of public trust and credibility. General trust in

The list of author affiliations is provided in the supplementary materials. Email: d.lazer@neu.edu

8 MARCH 2018 • VOL 359 ISSUE 6380 sciencemag.org SCIENCE


the mass media collapsed to historic lows in 2016, with 51% of Democrats and 14% of Republicans expressing "a fair amount" or "a great deal" of trust in mass media as a news source (2).

The United States has had a parallel evolution in its geo-socio-political environment. Geographic polarization of partisan preferences has dramatically increased over the past 40 years, reducing opportunities for cross-cutting political interaction. Homogeneous social networks, in turn, reduce tolerance for alternative views, amplify attitudinal polarization, boost the likelihood of accepting ideologically compatible news, and increase closure to new information. Dislike of the "other side" (affective polarization) has also risen. These trends have created a context in which fake news can attract a mass audience.

PREVALENCE AND IMPACT
How common is fake news, and what is its impact on individuals? Solutions should be commensurate to the magnitude of the problem, but there are surprisingly few scientific answers to these basic questions.

In evaluating the prevalence of fake news, we advocate focusing on the original sources—the publishers—rather than individual stories, because we view the defining element of fake news to be the intent and processes of the publisher. A focus on publishers also allows us to avoid the morass of trying to evaluate the accuracy of every single news story. One study evaluating the dissemination of prominent fake news stories estimated that the average American encountered between one and three stories from known publishers of fake news during the month before the 2016 election (3). This is likely a conservative estimate because the study tracked only 156 fake news stories. Another study reported that false information on Twitter is typically retweeted by many more people, and far more rapidly, than true information, especially when the topic is politics (4). Facebook has estimated that manipulations by malicious actors accounted for less than one-tenth of 1% of civic content shared on the platform (5), although it has not presented details of its analysis.

By liking, sharing, and searching for information, social bots (automated accounts impersonating humans) can magnify the spread of fake news by orders of magnitude. By one recent estimate—which classified accounts based on observable features such as sharing behavior, number of ties, and linguistic features—between 9 and 15% of active Twitter accounts are bots (6). Facebook estimated that as many as 60 million bots (7) may be infesting its platform. They were responsible for a substantial portion of political content posted during the 2016 U.S. campaign, and some of the same bots were later used to attempt to influence the 2017 French election (8). Bots are also deployed to manipulate algorithms used to predict potential engagement with content by a wider population. Indeed, a Facebook white paper reports widespread efforts to carry out this sort of manipulation during the 2016 U.S. election (5).

However, in the absence of methods to derive representative samples of bots and humans on a given platform, any point estimates of bot prevalence must be interpreted cautiously. Bot detection will always be a cat-and-mouse game in which a large, but unknown, number of humanlike bots may go undetected. Any success at detection, in turn, will inspire future countermeasures by bot producers. Identification of bots will therefore be a major ongoing research challenge.

We do know that, as with legitimate news, fake news stories have gone viral on social media. However, knowing how many individuals encountered or shared a piece of fake news is not the same as knowing how many people read or were affected by it. Evaluation of the medium-to-long-run impact of exposure to fake news on political behavior (for example, whether and how to vote) is essentially nonexistent in the literature. The impact might be small—evidence suggests that efforts by political campaigns to persuade individuals may have limited effects (9). However, mediation of much fake news via social media might accentuate its effect because of the implicit endorsement that comes with sharing. Beyond electoral impacts, what we know about the effects of media more generally suggests many potential pathways of influence, from increasing cynicism and apathy to encouraging extremism. There exists little evaluation of the impacts of fake news in these regards.

POTENTIAL INTERVENTIONS
What interventions might be effective at stemming the flow and influence of fake news? We identify two categories of interventions: (i) those aimed at empowering individuals to evaluate the fake news they encounter, and (ii) structural changes aimed at preventing exposure of individuals to fake news in the first instance.

Empowering individuals
There are many forms of fact checking, from websites that evaluate the factual claims of news reports, such as PolitiFact and Snopes, to evaluations of news reports by credible news media, such as the Washington Post, to contextual information regarding content inserted by intermediaries, such as those used by Facebook.

Despite the apparent elegance of fact checking, the science supporting its efficacy is, at best, mixed. This may reflect broader tendencies in collective cognition, as well as structural changes in our society. Individuals tend not to question the credibility of information unless it violates their preconceptions or they are incentivized to do so. Otherwise, they may accept information uncritically. People also tend to align their beliefs with the values of their community.

Research further demonstrates that people prefer information that confirms their preexisting attitudes (selective exposure), view information consistent with their preexisting beliefs as more persuasive than dissonant information (confirmation bias), and are inclined to accept information that pleases them (desirability bias). Prior partisan and ideological beliefs might prevent acceptance of fact checking of a given fake news story.

Fact checking might even be counterproductive under certain circumstances. Research on fluency—the ease of information recall—and familiarity bias in politics shows that people tend to remember information, or how they feel about it, while forgetting the context within which they encountered it. Moreover, they are more likely to accept familiar information as true (10). There is thus a risk that repeating false information, even in a fact-checking context, may increase an individual's likelihood of accepting it as true. There is mixed evidence about the effectiveness of claim repetition in fact checking (11).

Although experimental and survey research have confirmed that the perception of truth increases when misinformation is repeated, this may not occur if the misinformation is paired with a valid retraction. Some research suggests that repetition of the misinformation before its correction may even be beneficial. Further research is needed to reconcile these contradictions and determine the conditions under which fact-checking interventions are most effective.

Another longer-run approach seeks to improve individual evaluation of the quality of information sources through education. There has been a proliferation of efforts to inject training in critical-information skills into primary and secondary schools (12). However, it is uncertain whether such efforts improve assessments of information credibility or whether any such effects will persist over time. An emphasis on fake news might also have the unintended consequence of reducing the perceived credibility of real-news outlets. There is a great need for rigorous program evaluation of different educational interventions.
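The bot-prevalence estimate discussed under PREVALENCE AND IMPACT rests on classifying accounts from observable features (sharing behavior, number of ties, linguistic features) and then counting predicted bots. The sketch below illustrates that general idea only; it is not the classifier of ref. 6. The feature names, thresholds, and synthetic data are invented for illustration, using a simple per-feature midpoint vote rather than the machine-learning models used in practice.

```python
import random
import statistics

random.seed(0)

# Hypothetical account features: (posts_per_day, num_ties, lexical_diversity, label).
# Synthetic assumption: bots post more often, have fewer ties, lower lexical diversity.
def make_account(bot):
    if bot:
        return (random.gauss(120, 30), random.gauss(50, 20), random.gauss(0.2, 0.05), 1)
    return (random.gauss(10, 5), random.gauss(400, 150), random.gauss(0.6, 0.1), 0)

accounts = [make_account(random.random() < 0.15) for _ in range(1000)]
train, test = accounts[:700], accounts[700:]

# "Training": compute per-feature class means on labeled data, then use the
# midpoint between the bot and human means as a decision boundary per feature.
def feature_means(data, label):
    rows = [a for a in data if a[3] == label]
    return [statistics.mean(r[i] for r in rows) for i in range(3)]

bot_mu, human_mu = feature_means(train, 1), feature_means(train, 0)
midpoints = [(b + h) / 2 for b, h in zip(bot_mu, human_mu)]
bot_side = [b > m for b, m in zip(bot_mu, midpoints)]  # side of each midpoint where bots fall

def predict(account):
    # Majority vote across the three features: does the account fall on the bot side?
    votes = sum((account[i] > midpoints[i]) == bot_side[i] for i in range(3))
    return 1 if votes >= 2 else 0

accuracy = sum(predict(a) == a[3] for a in test) / len(test)
# A naive point estimate of bot prevalence: the share of accounts predicted to be bots.
bot_share = sum(predict(a) for a in accounts) / len(accounts)
```

Even on cleanly separated synthetic data like this, the resulting `bot_share` is only as good as the labels and features, which is why, as noted above, point estimates of bot prevalence on real platforms must be interpreted cautiously.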




Platform-based detection and intervention: Algorithms and bots
Internet platforms have become the most important enablers and primary conduits of fake news. It is inexpensive to create a website that has the trappings of a professional news organization. It has also been easy to monetize content through online ads and social media dissemination. The internet not only provides a medium for publishing fake news but offers tools to actively promote dissemination.

About 47% of Americans overall report getting news from social media often or sometimes, with Facebook as, by far, the dominant source (13). Social media are key conduits for fake news sites (3). Indeed, Russia successfully manipulated all of the major platforms during the 2016 U.S. election, according to recent testimony (7).

How might the internet and social media platforms help reduce the spread and impact of fake news? Google, Facebook, and Twitter are often mediators not only of our relationship with the news media but also with our friends and relatives. Generally, their business model relies on monetizing attention through advertising. They use complex statistical models to predict and maximize engagement with content (14). It should be possible to adjust those models to increase emphasis on quality information.

Platforms could provide consumers with signals of source quality that could be incorporated into the algorithmic rankings of content. They could minimize the personalization of political information relative to other types of content (reducing the creation of "echo chambers"). Functions that emphasize currently trending content could seek to exclude bot activity from measures of what is trending. More generally, platforms could curb the automated spread of news content by bots and cyborgs (users who automatically share news from a set of sources, with or without reading them), although for the foreseeable future, bot producers will likely be able to design effective countermeasures.

The platforms have attempted each of these steps and others (5, 15). Facebook announced an intent to shift its algorithm to account for "quality" in its content curation process. Twitter announced that it blocked certain accounts linked to Russian misinformation and informed users exposed to those accounts that they may have been duped. However, the platforms have not provided enough detail for evaluation by the research community or subjected their findings to peer review, making them problematic for use by policy-makers or the general public.

We urge platforms to collaborate with independent academics on evaluating the scope of the fake news issue and the design and effectiveness of interventions. There is little research focused on fake news and no comprehensive data-collection system to provide a dynamic understanding of how pervasive systems of fake news provision are evolving. It is impossible to recreate the Google of 2010. Google could not do so even if it had the underlying code, because the patterns emerge from a complex interaction among code, content, and users. However, it is possible to record what the Google of 2018 is doing. More generally, researchers need to conduct a rigorous, ongoing audit of how the major platforms filter information.

There are challenges to scientific collaboration from the perspectives of industry and academia. Yet, there is an ethical and social responsibility, transcending market forces, for the platforms to contribute what data they uniquely can to a science of fake news.

The possible effectiveness of platform-based policies suggests either self-regulation by the platforms or government intervention. Direct government regulation of an area as sensitive as news carries its own risks, constitutional and otherwise. For instance, could government regulators maintain (and, as important, be seen as maintaining) impartiality in defining, imposing, and enforcing any requirements? Generally, any direct intervention by government or the platforms that prevents users from seeing content raises concerns about either government or corporate censorship.

An alternative to direct government regulation would be to enable tort lawsuits alleging, for example, defamation by those directly and concretely harmed by the spread of fake news. To the extent that an online platform assisted in the spreading of a manifestly false (but still persuasive) story, there might be avenues for liability consistent with existing constitutional law, which, in turn, would pressure platforms to intervene more regularly. In the U.S. context, however, a provision of the 1996 Communications Decency Act offers near-comprehensive immunity to platforms for false or otherwise actionable statements penned by others. Any change to this legal regime would raise thorny issues about the extent to which platform content (and content-curation decisions) should be subject to second-guessing by people alleging injury. The European "right to be forgotten" in search engines is testing these issues.

Structural interventions generally raise legitimate concerns about respecting private enterprise and human agency. But just as the media companies of the 20th century shaped the information to which individuals were exposed, the far-more-vast internet oligopolies are already shaping human experience on a global scale. The questions before us are how those immense powers are being—and should be—exercised and how to hold these massive companies to account.

A FUTURE AGENDA
Our call is to promote interdisciplinary research to reduce the spread of fake news and to address the underlying pathologies it has revealed. Failures of the U.S. news media in the early 20th century led to the rise of journalistic norms and practices that, although imperfect, generally served us well by striving to provide objective, credible information. We must redesign our information ecosystem in the 21st century. This effort must be global in scope, as many countries, some of which have never developed a robust news ecosystem, face challenges around fake and real news that are more acute than in the United States. More broadly, we must answer a fundamental question: How can we create a news ecosystem and culture that values and promotes truth?

REFERENCES AND NOTES
1. C. Wardle, H. Derakhshan, "Information disorder: Toward an interdisciplinary framework for research and policy making" [Council of Europe policy report DGI(2017)09, Council of Europe, 2017]; https://firstdraftnews.com/wp-content/uploads/2017/11/PREMS-162317-GBR-2018-Report-de%CC%81sinformation-1.pdf?x29719.
2. A. Swift, Americans' trust in mass media sinks to new low (Gallup, 2016); www.gallup.com/poll/195542/americans-trust-mass-media-sinks-new-low.aspx.
3. H. Allcott, M. Gentzkow, J. Econ. Perspect. 31, 211 (2017).
4. S. Vosoughi et al., Science 359, XXX (2018).
5. J. Weedon et al., Information operations and Facebook (Facebook, 2017); https://fbnewsroomus.files.wordpress.com/2017/04/facebook-and-information-operations-v1.pdf.
6. O. Varol et al., in Proceedings of the 11th AAAI Conference on Web and Social Media (Association for the Advancement of Artificial Intelligence, Montreal, 2017), pp. 280–289.
7. Senate Judiciary Committee, Extremist content and Russian disinformation online: Working with tech to find solutions (Committee on the Judiciary, 2017); www.judiciary.senate.gov/meetings/extremist-content-and-russian-disinformation-online-working-with-tech-to-find-solutions.
8. E. Ferrara, First Monday 22, 2017 (2017).
9. J. L. Kalla, D. E. Broockman, Am. Polit. Sci. Rev. 112, 148 (2018).
10. B. Swire et al., J. Exp. Psychol. Learn. Mem. Cogn. 43, 1948 (2017).
11. U. K. H. Ecker et al., J. Appl. Res. Mem. Cogn. 6, 185 (2017).
12. C. Jones, Bill would help California schools teach about "fake news," media literacy (EdSource, 2017); https://edsource.org/2017/bill-would-help-california-schools-teach-about-fake-news-media-literacy/582363.
13. Gottfried, E. Shearer, News use across social media platforms 2017 (Pew Research Center, 7 September 2017); www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/.
14. E. Bakshy et al., Science 348, 1130 (2015).
15. C. Crowell, Our approach to bots & misinformation (Twitter, 14 June 2017); https://blog.twitter.com/official/en_us/topics/company/2017/Our-Approach-Bots-Misinformation.html.

ACKNOWLEDGMENTS
We acknowledge support from the Shorenstein Center at the Harvard Kennedy School and the NULab for Texts, Maps, and Networks at Northeastern University. D.L. acknowledges support by the Economic and Social Research Council ES/N012283/1. D.L. and M.B. contributed equally to this article.

SUPPLEMENTARY MATERIAL
www.sciencemag.org/content/359/xxx/xxx/suppl/DC1

10.1126/science.aao2998


