Case W00C47
April 12, 2019

Andrew Hoffman

Facebook: Fake News, Free Speech and an Internet Platform's Responsibility

Mark Zuckerberg, the chief executive officer of Facebook, sat in silence as he contemplated what he would say to the Facebook Board of Directors in light of the public backlash the company was receiving. Facebook had been thrust to the center of a national conversation about freedom of speech, fake news, and data privacy following exploitation of the platform by Russians attempting to interfere in the 2016 United States presidential election.

Since revelations after the election about Russian involvement, user mistrust of the platform had spread rampantly. Compounding the public backlash, Facebook had received harsh criticisms from both sides of the political aisle. In an effort to put the intrusions behind it, Facebook executives systematically attempted to downplay the extent of the problem and cover up its impacts. These efforts to deflect and skirt responsibility had recently been brought to the public's attention, further exacerbating the challenges facing the company.

Since its founding in 2004, Facebook had rapidly grown from an online network of college students to a global gathering place for billions of users to connect with friends and share information. Now a multibillion-dollar tech giant, and the number one source of news for many people, the platform had become a key player in the debate about fake news, free speech, and privacy in the internet era. While the company had begun to take tangible steps addressing many of the free speech and privacy issues it faced, it was unclear if these efforts were enough. Facebook could no longer escape the limelight.

As Zuckerberg pondered the appropriate next response, he couldn't help but question how Facebook got to this point in the first place. He built Facebook with the mission to help people connect and share. He did not anticipate the company becoming an arbiter of free speech. He kept grappling with the same questions over and over again. As a platform, and not a publisher, what was Facebook's level of liability for content posted on the site? Was fake news, a phenomenon that had been around for centuries, Facebook's responsibility, and was the company responsible for its users' activity? Should Facebook self-regulate or wait for legislators to act? He leaned back in his chair and exhaled loudly. What would be his next move?

Published by WDI Publishing, a division of the William Davidson Institute (WDI) at the University of Michigan.
© 2019 Kimin Cho, Katherine Cunningham, Greg Phillips, and Ben Pollins. This case was written by University of Michigan graduate
students Kimin Cho, Katherine Cunningham, Greg Phillips, and Ben Pollins, under the supervision of Andrew Hoffman, Holcim (US)
Professor of Sustainable Enterprise, a position that holds joint appointments at the University of Michigan’s Ross School of Business
and School of Environment and Sustainability. The case was prepared as the basis for class discussion rather than to illustrate either
effective or ineffective handling of a situation. The case should not be considered criticism or endorsement and should not be used
as a source of primary data. The opening situation of this case is a dramatization created for the purpose of class discussion and
engagement.


History of Facebook

Facebook was a social media platform that allowed users to connect and share information with their network around the globe. The platform was created in February 2004 by Harvard undergraduate student Mark Zuckerberg and his classmates Eduardo Saverin, Dustin Moskovitz, and Chris Hughes. When first created, Facebook was open only to Harvard students, but it quickly expanded to include students from other top-tier American universities, beginning with Stanford, Columbia, and Yale. At the end of its first year, the site had reached one million active users, and by the end of 2005 it had six million.1 By 2006, Facebook opened up membership to anyone over the age of 13. The site continued to grow exponentially, and as of September 2018, Facebook had 1.49 billion daily active users and 2.27 billion monthly active users (see Exhibit 1).2 Headquartered in Menlo Park, California, the company employed more than 33,600 people at 12 U.S. and 42 international offices.

The exponential growth of users was driven by Zuckerberg’s desire “to give people the power to share
in order to make the world more open and connected.”3 In 2017, Facebook updated its mission with the
goal of clarifying that Facebook’s intent was to foster positive connection: “Give people the power to build
community and bring the world closer together.”4

On Facebook, each user had an individual profile through which they could upload status updates, post photos, and interact with friends. Users connected to other users in their social network by sending friend requests, which could then be accepted or denied by the recipient. As of 2018, the service included interactive features such as a personal timeline, a direct messenger function, and an online marketplace. But the feature that came under the most scrutiny was the news feed.

Launched in September 2006, the news feed revolutionized the site, propelling it to become the primary source of news for millions of users around the world, including 44% of the American population.5 The news feed was driven by an algorithm that filtered news and updates according to what it calculated were the contents most important to an individual user. One consequence was that users ended up in an echo chamber, as they were exposed only to content that confirmed their pre-existing beliefs. The advent of the Like button in 2009 made the news feed feature more popular, since it acted like a "social lubricant," providing users with positive affirmation and the feeling that they were being heard.6 Beyond the benefit of increased user engagement, the Like button was a critical new tool that allowed Facebook to collect personal data about its users. This data helped further refine the algorithm dictating what content showed up on users' news feeds.

Over time, the range of users on Facebook evolved and the ways users interacted with the site changed. When Facebook first launched, the site was used primarily by individuals to connect with their friends, share updates and photos, and chat with each other. However, celebrities, politicians, businesses, nonprofits, news media, and other organizations soon joined Facebook when they realized the scale at which they could reach and expand their audiences. Beginning in the run-up to the 2008 U.S. presidential election, Facebook became a powerful tool for political organizing and campaigning. Over 1,000 Facebook groups were formed in support of candidates Barack Obama and John McCain.7 In 2011, Facebook was used by activists in Tunisia and Egypt to spark political revolution and topple dictators. Around the globe, Facebook became a vital tool on small and large scales for grassroots organizing and political campaigning.


As a free platform, Facebook's business model focused on revenue from targeted advertisements. The personal data Facebook collected from its users was grouped into demographic buckets, such as geographic region, socio-economic status, political affiliation, religion, sexual orientation, and gender.8 Facebook then sold advertising space based on these buckets, allowing third parties to target specific audiences. With easy access to billions of people around the world, companies and nonprofit organizations used Facebook advertising services for large-scale marketing campaigns. Overall, approximately 99% of Facebook's revenue came from advertisements.9 Focusing on ads as its primary revenue stream worked well for Facebook; between 2013 and 2017, Facebook's revenue grew from $8 billion to more than $40 billion (see Exhibit 2).10

In 2012, Facebook went public and raised more than $16 billion in the largest tech IPO in U.S. history.11 The company then acquired Instagram, a photo- and video-sharing social media platform, and WhatsApp, a chat platform. With ever-increasing scope and growth, concerns rose around user privacy, freedom of speech, and the potential for Facebook to be used to spread misinformation and fake news. These concerns erupted in the lead-up to the 2016 U.S. presidential election.

2016 United States Presidential Election

The 2016 election was the most turbulent of a generation. Alongside rapidly changing economic conditions and years of legislative gridlock came the unprecedented exploitation of social media by domestic and foreign actors to influence the election. While the extent of foreign interference was not wholly revealed until months and years after the election, it soon came to light that not only were Facebook users subject to manipulative tactics intended to degrade civil society in the United States, but that Facebook itself had the opportunity to take more proactive steps to fight this attempt to undermine democracy.12

Fake News

During the 2016 presidential campaigns, fake news became a rampant epidemic on Facebook and other leading social media platforms. One prominent example was a story about Democratic candidate Hillary Clinton and her campaign chairman, John Podesta, running a child sex ring out of the basement of a Washington, D.C., pizza shop. The story incited a gunman to take matters into his own hands and "self-investigate." While no one was hurt in the incident, the story was indicative of the real-world implications of politically charged fake news stories. Less extreme but equally false stories impacted the way the voting population perceived candidates.13

Failure to Secure User Data and the Cambridge Analytica Scandal

Key to the effectiveness of false information campaigns was the exploitation of user data to identify and target individuals with specific political leanings. This data was often acquired in a nefarious or less than transparent manner.

Outcries over privacy control emerged after the discovery that Cambridge Analytica, a British political consulting firm that specialized in data mining and data analysis, had harvested private information from more than 50 million Facebook users to target ads for the 2016 U.S. presidential election. The firm developed a new form of paid political ads, designed for and directed at individuals using what was known as "psychoanalytics" for maximum psychological impact. This method was used to influence the United Kingdom's 2016 vote to exit the European Union (EU), a use for which Facebook was later penalized by British regulators.14 Cambridge Analytica's techniques would again be utilized for political purposes by Donald Trump's presidential campaign.15


It was estimated that Cambridge Analytica had harvested the data of as many as 87 million U.S. social media users prior to the election. This private information was acquired without the permission of users and used to exploit the American electorate. Commenting on Cambridge Analytica's modus operandi, a founder of the firm was quoted as saying, "Rules don't matter for them. For them, this is a war, and it's all fair."16 Moreover, it was revealed that the firm was in contact with Kremlin-linked oligarchs as it applied these techniques.17

Beyond Cambridge Analytica, Facebook had a history of secretly sharing users' personal data. Facebook had partnerships with dozens of the world's largest internet companies, including Apple, Microsoft, Amazon, and Russian search giant Yandex. For example, Facebook allowed Netflix and Spotify to read users' private messages without their consent. Facebook provided no detectable oversight in terms of protecting the user data shared in these partnerships.18

The Russian Effort to Influence the Campaign

During the campaign leading up to the 2016 elections, Russian agents attempted to sow social discord among voters through the use of fabricated Facebook accounts.19 The Internet Research Agency (IRA), a Russian government-funded group, purchased digital ads and organized sometimes-incendiary political rallies in the United States. The group used fabricated identities to inflame the American population through intentionally divisive ads and images. This propaganda reached 150 million Americans through social media. As the post-election investigation progressed, members of the IRA were charged by the U.S. Department of Justice with subverting the 2016 election and supporting Donald Trump's presidential campaign.20 Russia sought greater diplomatic power on the world stage; a top cyber-intelligence adviser to President Vladimir V. Putin was quoted as saying, "We are living in 1948." The adviser's reference to the eve of the first Soviet atomic bomb test came in a speech reported by The Washington Post. The adviser continued, "I'm warning you: We are at the verge of having something in the information arena that will allow us to talk to the Americans as equals."21

Despite the unusually contentious election, and an internal finding that Russian hackers were attempting to influence the presidential election, Facebook had no policy or resources related to disinformation. Shortly after the election, Zuckerberg would not accept the idea of Russian interference via Facebook in the election, and only later would an internal investigation shed light on the pervasive level to which the site was exploited for nefarious purposes.22

History of Fake News and Misinformation

Fake news was not a new phenomenon. Attempts to spread false information had been traced back to the beginnings of news in print. With the advent of the Gutenberg printing press in 1439, disseminating news became much easier and quicker. With this invention came the spread of fake news as well. Many of the earliest documented examples of fake news centered on religion. Various religious entities would spread lies about a tragedy with the intent of instilling fear in people. Many were intended to drum up hatred toward other religions.23

Fake news in the political sphere was also nothing new, playing a role in American politics from the beginning. The founding fathers were known to spread propaganda in order to muster support for the revolution and to get people to enlist. For example, Benjamin Franklin, one of America's most revered founding fathers, published false stories about King George III.24 Even "Honest Abe" Lincoln used false stories for political gain. During his run for office he purchased a German-language newspaper in order to publish stories praising himself and gain favor with immigrants.25


Generally speaking, with the rise of reputable publications, journalistic norms took shape. In 1896 Adolph Ochs purchased the New York Times with the intent of creating a fact-based newspaper and proving that news publishing need not be sensational to be profitable. Though varieties of bias in journalism remained prevalent, principled, fact-based reporting became the standard.

The Internet brought about an immense challenge to journalistic integrity. While the Internet led to greater access to information and more extensive news coverage than traditional publications, it also enabled fake news and poor-quality journalism to flourish. Anyone with access to the Internet could post any content and reach a mass audience, and the rise of the "citizen journalist" decreased accountability and credibility. The algorithms of news aggregators and search engines had no regard for accuracy or whether a news story was grounded in fact. Additionally, social media changed the landscape of news, broadening the pathways through which both journalists and non-journalists disseminated both accurate and misleading information. While social media platforms such as Facebook and Twitter were not themselves the originators of information, they aggregated news stories and provided customized feeds to their users. This structure, in many ways, increased the difficulty users faced in determining the legitimacy of a source.26

The Internet also eroded the financial fundamentals of traditional, reputable news organizations. According to the Pew Research Center's 2016 report on the state of the news media, advertising revenue declined substantially, newsroom staffing was cut throughout the industry, and the number of newspapers dwindled.27

Following the 2016 presidential election, the definition of the term fake news was co-opted by Donald Trump. While fake news generally referred to stories predicated on false information, Trump largely used the term to describe news stories unfavorable to him and his administration.

According to academics who had studied the subject, fake news fell into six broad categories: satire, news parody, news fabrication, photo manipulation, advertising, and propaganda. Common to all six categories was the attempt to look and sound like real news in order to create a facade of legitimacy and credibility.28

Social Media Industry Overview

The issues of user privacy and fake news were not unique to Facebook. Other social media platforms faced these challenges, including:

Twitter: Launched in 2006, Twitter was a real-time communications platform in which users had only 280 characters to express their thoughts (originally the character limit was 140). By 2018, approximately 335 million people used Twitter around the world, not always truthfully. For example, a 2018 study by the Knight Foundation found 6.6 million tweets linked to fake and conspiracy news publishers in the month before the 2016 U.S. presidential election.29

Instagram: Launched in 2010, Instagram was a photo- and video-sharing platform used by approximately 1 billion people per month as of 2018.30 Instagram was acquired by Facebook in 2012. Among its users, Instagram hosted fake accounts that spread fake news, automated apps that left spam comments, and services that sold fake followers.

YouTube: Launched in 2005 and acquired by Google in 2006, the video-sharing website YouTube was used by approximately 1.8 billion people per month as of 2018.31 Videos spreading misinformation and extreme ideologies proliferated on YouTube, thanks in part to a YouTube algorithm that favored videos with extreme content. Infamously, terrorist groups such as ISIS and al-Qaeda used YouTube to upload recruitment and propaganda videos. Another example was a set of conspiracy videos that emerged on the site in the wake of the 2018 mass shooting at a high school in Parkland, Florida, claiming that the survivors of the shooting were "crisis actors."32

Other Negative Impacts of Facebook

In addition to the attention Facebook received for its role in the spread of fake news and the 2016 election, the company was scrutinized regarding use of its platform to disseminate hate speech and perpetrate human rights violations.

Hate speech and bullying: Facebook was widely used to spread hate speech and as a platform for online bullying. While the company reported that it caught and removed 95% of nudity, fake accounts, and graphic violence through reviewers and artificial intelligence, it was less successful at finding and removing hate speech and bullying. Facebook caught only about 52% and 15%, respectively, of such incidents before they were reported by users.33 The spread of hate speech online had significant implications. For example, studies linked anti-refugee hate speech on Facebook and other social media in Germany to a spike in violent hate crimes committed against refugees.34

Myanmar: According to reporting from the United Nations Human Rights Council, the Myanmar military perpetrated massive, criminal human rights violations in three states in the Asian country. The violations targeted the Rohingya population, a minority group living primarily in the Rakhine state in Myanmar. Violations by the military included the killing of thousands of Rohingya, gang rape, and the burning of villages.35 Facebook was used to spread anti-Muslim and anti-Rohingya material in the country, ultimately inciting violence. According to an independent report commissioned by Facebook on its impact in these atrocities, the company did not do enough to screen for and prevent the dissemination of such materials, including not fully enforcing Facebook's existing community standards.36

Terrorist propaganda: Social media, including Facebook, was used by international terrorist organizations to disseminate propaganda and recruit members. Platforms were generally slow to respond to this growing threat, but began increasing efforts to screen for this type of material.37 In a November 2018 blog post, Facebook detailed actions to combat the spread of terrorist propaganda, including increasing its use of machine learning to help identify potential posts from terrorist organizations.38

Legal Frameworks

Personal Data Laws

As of 2018, no single comprehensive law regulated the collection and usage of personal data in the United States. Instead, a series of laws protected specific categories of personal data. For instance, the Federal Trade Commission Act, which prohibited deceptive business activity, was used to prosecute firms for failure to comply with their privacy policies or user agreements by disclosing personal data of customers without their authorization. Multiple guidelines and regulations, created by government agencies and industry organizations, existed to steer private data management practices.39 For example, the Federal Communications Commission (FCC) required internet access service providers to disclose all information regarding management practices and commercial terms so that consumers could make informed decisions.40 Analysts noted that the problem of misusing private data could be diminished by providing clear and easily accessible user agreements to service users.

International markets, which constituted the majority of Facebook's active user base, differed from the U.S. in their legal treatment of personal online data. The EU updated its overarching personal data law with the General Data Protection Regulation (GDPR) in 2016. GDPR recognized protection of personal data as a fundamental right of all citizens. While the GDPR was notably strict, the law could be overridden for the greater good of society in situations such as a criminal investigation or addressing threats to public safety. GDPR also allowed more generous usage of private data if the data were in an anonymous form. Private data laws in India, South Korea, and Japan trended toward those of the EU.

Freedom of Speech

The First Amendment of the United States Constitution prevented any government law abridging the freedom of speech and of the press. Producers of misleading, false, or libelous news articles could be sued for defamation or other offenses, but winning such suits was difficult. In New York Times Co. v. Sullivan, the Supreme Court held that public officials suing for libel must prove the statements were made with actual malice for the litigation to succeed.41 Because it was a private corporation, Facebook could implement private censorship rules to filter fake news articles within its platform. However, actions of this sort could have implications for Facebook's status as a platform with limited liability for user-produced content, unlike a publisher, which assumed both editorial control and liability for the content.

Communications Decency Act, Section 230

In the United States, Section 230 of the 1996 Communications Decency Act largely governed the level of liability companies assumed for content disseminated on their online platforms. The statute drew a distinction between "publishers" and "distributors," making the former liable for all content featured on their service while limiting liability for the latter. Generally, the distinction was made based on whether an entity was capable of reviewing and controlling all of the content it provided to the public. Publishers were liable for anything appearing in their products or services, as they knew what information they were publishing. On the other hand, platforms were regarded as distributors (such as newsstands or libraries) if they could not review all of the content they distributed.42

The act stated that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."43 This statement reflected the assumption of the lawmakers that internet providers (at that time) could not possibly review or control all content on their service platforms, just as newsstands could not review all of the publications they sold.

To counter the fake news epidemic, Facebook decided to implement stricter content monitoring practices. Whether done manually or through algorithms, Facebook's more comprehensive review plan indicated that it was capable of monitoring all content shared on the site. However, this willingness and ability to monitor content could cause Facebook to assume greater legal risks.44

For example, in 1994, Prodigy—an online service that hosted an anonymous finance-themed bulletin board—was sued by Stratton Oakmont for libel, based on content on its site. Prodigy argued it could not be liable for the content of anonymous posts as it was a platform, not the publisher. However, courts found Prodigy liable for the content because it moderated posts by censoring foul language.45 Should Facebook start moderating all content in a more proactive manner, it could find itself subject to lawsuits similar to the one faced by Prodigy, in which the company had to take legal responsibility for its users' posts.


Campaign Finance Law

Fake news content was often funded by partisan groups. Under campaign finance laws in the United States, most of this content did not constitute a violation. If the content did not explicitly mention voting for a specific candidate, telling voters about another candidate's negative personal history or current stances was considered a party-building activity.46 Such party-building activities could be funded by "soft money," which was excluded from the control of federal funding law by 1979 Federal Election Commission (FEC) amendments.47 Facebook, as a private corporate entity, could autonomously screen campaign content of this sort. But voluntary action on Facebook's part could result in political backlash and possible revenue loss.

Facebook's Approach to Monitoring Free Speech

Over time, Facebook developed an internal process to identify content that violated the platform's community standards, such as misinformation, hate speech, and content that expressed support or praise for dangerous individuals or organizations.48 Twice a month, several dozen Facebook employees, including engineers and lawyers, met to develop guidelines dictating what was and was not appropriate for users to post on the site. These guidelines were then sent to thousands of Facebook moderators around the world. The moderators would use the guidelines to determine whether specific content should remain on the site or be removed. However, the guidelines were developed piecemeal and could change week to week, and there was no centralized document with a complete set of guidelines. Instead, the guidelines existed in numerous PowerPoint presentations and Excel spreadsheets.49 (See Exhibit 3 for an example of a Facebook guideline.)

Facebook's Post-Election Reaction

After the 2016 U.S. presidential election, Facebook's leadership team was slow to acknowledge the extent to which the site had been used to spread misinformation and fake news throughout the campaign period. In multiple interviews, Zuckerberg expressed incredulity that Facebook could have swayed the outcome of the election. It was only after intense pressure from lawmakers, regulators, researchers, journalists, employees, investors, and users that the leadership team began to understand the severity of the issue and Facebook's role in it.50

Facebook responded in part to these pressures through a series of technical steps. To address misinformation, Facebook hired third-party fact-checking organizations to detect and flag suspicious content and automated accounts.51 For example, when users tried to post a questionable news story, a warning message would pop up indicating that the story was disputed by fact-checkers.52 Additionally, Facebook started to remove fake accounts and ban so-called bad actors. Facebook began to register political ads in a public database, and content sponsors were required to provide their identity and domestic mailing addresses.53 However, authorized advertisers were still able to conceal their identity by filling out the "paid for by" section of an ad with any text they wanted.54 To address user privacy, Facebook limited what data could be accessed by third-party apps and increased transparency for users regarding whether their personal information was being shared.

Facebook's public reaction to its role in the election differed from its internal conclusions. While in the summer of 2017 one executive called the situation "a five-alarm fire," it was not until that October that the company acknowledged the influence of Russian posts.55 Opposed to possible regulatory oversight, chief operating officer Sheryl Sandberg began a years-long effort to distract legislators and the public from the extent of the fake news crisis. Facebook hired a consulting firm, Definers Public Affairs, to disseminate positive content about Facebook, as well as negative content that highlighted "unsavory" business practices by Google and Apple.56 In March of 2018, Facebook suspended Cambridge Analytica from its platform, but it was unclear whether leadership truly understood the public scrutiny the company faced. A month later, Zuckerberg appeared before the U.S. Senate Commerce and Judiciary Committees. The situation raised questions about Facebook's actions but also about legislators' ability to effectively regulate internet companies, given their unfamiliarity with the topic. In any case, Facebook began to take more proactive internal measures to demonstrate that regulation was not needed, while publicly presenting a willingness for oversight.

rP
Evolving Challenges

Continued Activity Connected to Foreign Interference


In 2018, Facebook revealed attempts to use the site in ways similar to those of the 2016 presidential
campaign, with accounts linked to the IRA. This included a sequel to the 2017 white supremacist rally in
Charlottesville, Virginia, and a left-wing campaign to incite opposition to the U.S. Immigration and Customs

yo
Enforcement agency. Moreover, several vulnerable Democratic candidates were targeted by Russian hackers.

As foreign infuence operations continued to develop in their sophistication, Facebook assembled a


“War Room” prior to the 2018 congressional elections. The War Room was intended as a proactive means
of building defenses against such attacks. The group implemented dashboards that could track and stop
suspicious activity and alert security experts. It focused specifcally on disenfranchisement efforts and
those that could manifest real-world harm.57
op
Facebook executives characterized the effort as a “cat-and-mouse game.” The company took several
months to acknowledge the details of the Russian operation, including aspects of the more clandestine
techniques that Russian trolls used this time.58

Domestic Misinformation Campaigns

In addition to foreign operatives, Facebook found itself increasingly contending with divisive messages originating domestically from both the right and the left. "There are now well-developed networks of Americans targeting other Americans with purposefully designed manipulations," said Molly McKew, an information warfare researcher at New Media Frontier, a firm that studied social media. This new source of manipulative information raised free speech issues. "These networks are trying to manipulate people by manufacturing consensus — that's crossing the line over free speech," said Ryan Fox, a co-founder of New Knowledge, a firm that tracked disinformation.59

Fraught Reputation and Attracting New Employees

To remain competitive, Facebook needed to continue to attract the best and the brightest to its workforce. The recent dilemmas had damaged the company's reputation, creating a stigma that had dissuaded some young people from seeking employment there.60 Employees at other large technology companies had staged walkouts that led to changes in their companies' strategies.61 In response, Facebook's chief executive began to hold meetings to address employee concerns and foster a culture of transparency and information sharing.62

Societal Forces and Facebook's Effort in the 2018 Election

Election Day 2018 occurred with apparent integrity on social media due to pressure from multiple actors in society, including lawmakers, regulators, researchers, journalists, employees, investors, and users.63 It was unclear if Facebook could maintain this integrity, given the resources required in the U.S. alone, much less in its global operations, as foreign agents continued to adapt their techniques. It was also not yet clear if new regulations would emerge, or what the specifics of those regulations might be.

A Plan of Action Moving Forward

Zuckerberg sat back and contemplated the trajectory of Facebook. His company had taken demonstrable measures in recent months to crack down on the dissemination of hateful content and the use of Facebook for nefarious political purposes. The company had also utilized certain public relations and political tactics that received public condemnation. How would the company respond and recover? Were the company's current efforts to increase privacy and transparency sufficient? Should Facebook further self-regulate or wait for legislators to catch up to a rapidly evolving industry? No clear path existed for the company given the sticky nature of issues related to freedom of speech, but it was clear the company needed to take action quickly.

Exhibits

Exhibit 1
Facebook's User Growth

Source: "Number of monthly active Facebook users worldwide as of 3rd quarter 2018 (in millions)." Statista. https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/. Accessed 16 Dec. 2018.

Exhibits (cont.)

Exhibit 2
Quarterly Revenue Growth

Source: Investor.fb.com. "Facebook Q3 2018 Results." https://s21.q4cdn.com/399680738/files/doc_financials/2018/Q3/Q3-2018-Earnings-Presentation.pdf. Accessed 15 Dec. 2018.

Exhibits (cont.)

Exhibit 3
Examples of Facebook Moderators' Guidelines

Source: Fisher, M. "Inside Facebook's Secret Rulebook for Global Political Speech." New York Times. 27 Dec. 2018. https://www.nytimes.com/2018/12/27/world/facebook-moderators.html. Accessed 15 Jan. 2019.

Exhibits (cont.)

Exhibit 4
Ads Russia Bought on Facebook in 2016

Source: Shane, Scott. "These Are the Ads Russia Bought on Facebook in 2016." New York Times. 1 Nov. 2017. Accessed 13 Dec. 2018.

Exhibits (cont.)

Exhibit 5
Example of a Fake Facebook Account

Source: Shane, Scott. "The Fake Americans Russia Created to Influence the Election." New York Times. 7 Sept. 2017. https://www.nytimes.com/2017/09/07/us/politics/russia-facebook-twitter-election.html?module=inline. Accessed 8 Dec. 2018.

Endnotes

1 Hall, M. "Facebook." Encyclopaedia Britannica. https://www.britannica.com/topic/Facebook. Accessed 28 Nov. 2018.
2 "Facebook Company Info." Facebook. https://newsroom.fb.com/company-info/. Accessed 13 Dec. 2018.
3 Bourg, Anya, James Jacoby, and Dana Priest. "The Facebook Dilemma." Frontline. 29 Oct. 2018. https://www.pbs.org/wgbh/frontline/film/facebook-dilemma/. Accessed 12 Dec. 2018.
4 Constine, J. "Facebook changes mission statement to 'bring the world closer together'." TechCrunch. 22 June 2017. https://techcrunch.com/2017/06/22/bring-the-world-closer-together/. Accessed 12 Dec. 2018.
5 Tandoc, C. E., Z. W. Lim, and R. Ling. "Defining 'Fake News'." Digital Journalism, 6:2, 137-153. DOI: 10.1080/21670811.2017.1360143. 30 Aug. 2017.
6 Bourg, Anya, James Jacoby, and Dana Priest. "The Facebook Dilemma." Frontline. 29 Oct. 2018. https://www.pbs.org/wgbh/frontline/film/facebook-dilemma/. Accessed 12 Dec. 2018.
7 Hall, M. "Facebook." Encyclopaedia Britannica. https://www.britannica.com/topic/Facebook. Accessed 28 Nov. 2018.
8 Sharma, R. "How Does Facebook Make Money?" Investopedia. 11 Dec. 2018. https://www.investopedia.com/ask/answers/120114/how-does-facebook-fb-make-money.asp. Accessed 15 Dec. 2018.
9 Sharma, R. "How Does Facebook Make Money?" Investopedia. 11 Dec. 2018. https://www.investopedia.com/ask/answers/120114/how-does-facebook-fb-make-money.asp. Accessed 15 Dec. 2018.
10 D&B Hoovers. Facebook, Inc. Company Profile. Dun & Bradstreet, Inc. 2018. Accessed 14 Dec. 2018.
11 Rusli, E., and P. Eavis. "Facebook Raises $16 Billion in I.P.O." New York Times. 17 May 2012. https://dealbook.nytimes.com/2012/05/17/facebook-raises-16-billion-in-i-p-o/. Accessed 14 Feb. 2019.
12 Frenkel, S., N. Confessore, C. Kang, M. Rosenberg, and J. Nicas. "Delay, Deny and Deflect: How Facebook's Leaders Fought Through Crisis." New York Times. 14 Nov. 2018. https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html. Accessed 14 Feb. 2019.
13 Tandoc, C. E., Z. W. Lim, and R. Ling. "Defining 'Fake News'." Digital Journalism, 6:2, 137-153. DOI: 10.1080/21670811.2017.1360143. 30 Aug. 2017.
14 Satariano, A., and S. Frenkel. "Facebook Fined in U.K. Over Cambridge Analytica Leak." New York Times. 10 July 2018. https://www.nytimes.com/2018/07/10/technology/facebook-fined-cambridge-analytica-britain.html. Accessed 14 Dec. 2018.
15 Confessore, N. "Cambridge Analytica and Facebook: The Scandal and the Fallout So Far." New York Times. 4 Apr. 2018. https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html. Accessed 13 Dec. 2018.
16 Rosenberg, M., N. Confessore, and C. Cadwalladr. "How Trump Consultants Exploited the Facebook Data of Millions." New York Times. 17 Mar. 2018. https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html?module=inline. Accessed 13 Dec. 2018.
17 Granville, K. "Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens." New York Times. 19 Mar. 2018. https://www.nytimes.com/2018/03/19/technology/facebook-cambridge-analytica-explained.html. Accessed 28 Nov. 2018.
18 Dance, G., M. LaForgia, and N. Confessore. "As Facebook Raised a Privacy Wall, It Carved an Opening for Tech Giants." New York Times. 18 Dec. 2018. https://www.nytimes.com/2018/12/18/technology/facebook-privacy.html. Accessed 19 Dec. 2018.
19 Oremus, W. "It's Time for Facebook's Workers to Speak Out." Slate. 29 Nov. 2018. https://slate.com/technology/2018/11/facebook-workers-should-speak-up-about-their-company-right-now.html. Accessed 11 Dec. 2018.
20 Oremus, W. "It's Time for Facebook's Workers to Speak Out." Slate. 29 Nov. 2018. https://slate.com/technology/2018/11/facebook-workers-should-speak-up-about-their-company-right-now.html. Accessed 11 Dec. 2018.
21 Shane, S. "The Fake Americans Russia Created to Influence the Election." New York Times. 7 Sep. 2017. https://www.nytimes.com/2017/09/07/us/politics/russia-facebook-twitter-election.html?module=inline. Accessed 11 Dec. 2018.
22 Frenkel, S., N. Confessore, C. Kang, M. Rosenberg, and J. Nicas. "Delay, Deny and Deflect: How Facebook's Leaders Fought Through Crisis." New York Times. 14 Nov. 2018. https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html. Accessed 14 Feb. 2019.
23 Soll, J., M. Severns, J. Shafer, and M. Grunwald. "The Long and Brutal History of Fake News." Politico. 18 Dec. 2016. https://www.politico.com/magazine/story/2016/12/fake-news-history-long-violent-214535. Accessed 10 Dec. 2018.
24 Soll, J., M. Severns, J. Shafer, and M. Grunwald. "The Long and Brutal History of Fake News." Politico. 18 Dec. 2016. https://www.politico.com/magazine/story/2016/12/fake-news-history-long-violent-214535. Accessed 10 Dec. 2018.
25 Holzer, H. "'Honest Abe' wasn't above raucous debates, savvy politics." CNN. 14 Mar. 2016. https://www.cnn.com/2016/03/10/opinions/holzer-lincoln-douglas-debates/index.html. Accessed 5 Dec. 2018.
26 Tandoc, C. E., Z. W. Lim, and R. Ling. "Defining 'Fake News'." Digital Journalism, 6:2, 137-153. DOI: 10.1080/21670811.2017.1360143. 30 Aug. 2017.
27 "State of the News Media 2016." Pew Research Center. 15 June 2016. http://assets.pewresearch.org/wp-content/uploads/sites/13/2016/06/30143308/state-of-the-news-media-report-2016-final.pdf. Accessed 15 Dec. 2018.
28 Tandoc, C. E., Z. W. Lim, and R. Ling. "Defining 'Fake News'." Digital Journalism, 6:2, 137-153. DOI: 10.1080/21670811.2017.1360143. 30 Aug. 2017.
29 "Disinformation, 'Fake News' and Influence Campaigns on Twitter." Knight Foundation. 4 Oct. 2018. https://www.knightfoundation.org/reports/disinformation-fake-news-and-influence-campaigns-on-twitter. Accessed 15 Dec. 2018.
30 Constine, J. "Instagram hits 1 billion monthly users, up from 800M in September." TechCrunch. 20 June 2018. https://techcrunch.com/2018/06/20/instagram-1-billion-users/. Accessed 15 Dec. 2018.
31 Gilbert, B. "YouTube now has over 1.8 billion users every month, within spitting distance of Facebook's 2 billion." Business Insider. 4 May 2018. https://www.businessinsider.com/youtube-user-statistics-2018-5. Accessed 15 Dec. 2018.
32 Ingram, M. "YouTube finally decides it should care about misinformation." The Media Today. 11 July 2018. https://www.cjr.org/the_media_today/youtube-fake-news.php. Accessed 15 Dec. 2018.
33 Romm, T., and E. Dwoskin. "Facebook says it removed a flood of hate speech, terrorist propaganda and fake accounts from its site." Washington Post. 15 Nov. 2018. https://www.washingtonpost.com/technology/2018/11/15/facebook-says-it-removed-flood-hate-speech-terrorist-propaganda-fake-accounts-its-site/. Accessed 16 Dec. 2018.
34 Müller, K., and C. Schwarz. "Fanning the Flames of Hate: Social Media and Hate Crime." Stanford Social Innovation Review. 30 Nov. 2018. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3082972. Accessed 9 Dec. 2018.
35 "Myanmar: UN Fact-Finding Mission releases its full account of massive violations by military in Rakhine, Kachin and Shan States." United Nations Office of the High Commissioner. 18 Sep. 2018. https://www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx?NewsID=23575&LangID=E. Accessed 9 Dec. 2018.
36 Allison-Hope, D. "Our Human Rights Impact Assessment of Facebook in Myanmar." Business for Social Responsibility. 5 Nov. 2018. https://www.bsr.org/en/our-insights/blog-view/facebook-in-myanmar-human-rights-impact-assessment. Accessed 11 Dec. 2018.
37 Leetaru, K. "Can We Finally Stop Terrorists From Exploiting Social Media?" Forbes. 9 Oct. 2018. https://www.forbes.com/sites/kalevleetaru/2018/10/09/can-we-finally-stop-terrorists-from-exploiting-social-media/. Accessed 13 Dec. 2018.
38 "Hard Questions: What Are We Doing to Stay Ahead of Terrorists?" Facebook. 8 Nov. 2018. https://newsroom.fb.com/news/2018/11/staying-ahead-of-terrorists/. Accessed 13 Dec. 2018.
39 Jolly, I. "Practical Law." Thomson Reuters. 1 Oct. 2018. https://content.next.westlaw.com/Document/I02064fbd1cb611e38578f7ccc38dcbee/View/FullText.html. Accessed 14 Dec. 2018.
40 "Electronic Code of Federal Regulations." US Government Publishing Office. 13 Dec. 2018. https://www.ecfr.gov/cgi-bin/text-idx?SID=2a8e6fcef18423ed25ac68af2a26207f&mc=true&node=se47.1.8_11&rgn=div8. Accessed 13 Dec. 2018.
41 "Libel." Legal Information Institute, Cornell Law School. https://www.law.cornell.edu/wex/libel. Accessed 13 Dec. 2018.
42 "Section 230 Protections." Electronic Frontier Foundation. https://www.eff.org/issues/bloggers/legal/liability/230. Accessed 13 Dec. 2018.
43 "Section 230 Protections." Electronic Frontier Foundation. https://www.eff.org/issues/bloggers/legal/liability/230. Accessed 13 Dec. 2018.
44 "Who Can Be Sued?" Channel 4. 2018. https://www.channel4.com/producers-handbook/media-law/defamation/who-can-be-sued. Accessed 13 Dec. 2018.
45 Selyukh, A. "Section 230: A Key Legal Shield For Facebook, Google Is About To Change." NPR. 21 Mar. 2018. https://www.npr.org/sections/alltechconsidered/2018/03/21/591622450/section-230-a-key-legal-shield-for-facebook-google-is-about-to-change. Accessed 13 Dec. 2018.
46 "What is the difference between soft money and hard money campaign donations?" HowStuffWorks. 26 Oct. 2000. https://money.howstuffworks.com/question498.htm. Accessed 14 Dec. 2018.
47 "Twenty Year Report." Federal Election Commission. Apr. 1995. https://transition.fec.gov/pdf/20year.pdf. Accessed 13 Dec. 2018.
48 "Community Standards." Facebook. https://www.facebook.com/communitystandards/. Accessed 27 Jan. 2019.
49 Fisher, M. "Inside Facebook's Secret Rulebook for Global Political Speech." New York Times. 27 Dec. 2018. https://www.nytimes.com/2018/12/27/world/facebook-moderators.html. Accessed 28 Dec. 2018.
50 Roose, K. "Facebook Thwarted Chaos on Election Day. It's Hardly Clear That Will Last." New York Times. 7 Nov. 2018. https://www.nytimes.com/2018/11/07/business/facebook-midterms-misinformation.html. Accessed 9 Dec. 2018.
51 Fandos, N., and K. Roose. "Facebook Identifies an Active Political Influence Campaign Using Fake Accounts." New York Times. 31 July 2018. https://www.nytimes.com/2018/07/31/us/politics/facebook-political-campaign-midterms.html?module=inline. Accessed 9 Dec. 2018.
52 Sonnad, N. "This is now what happens when you try to post fake news on Facebook." Quartz. 19 Mar. 2017. https://qz.com/936503/facebooks-new-method-of-fighting-fake-news-is-making-it-hard-for-people-to-post-a-false-story-about-irish-slaves/. Accessed 16 Dec. 2018.
53 Fandos, N., and K. Roose. "Facebook Identifies an Active Political Influence Campaign Using Fake Accounts." New York Times. 31 July 2018. https://www.nytimes.com/2018/07/31/us/politics/facebook-political-campaign-midterms.html?module=inline. Accessed 9 Dec. 2018.
54 Roose, K. "Facebook Thwarted Chaos on Election Day. It's Hardly Clear That Will Last." New York Times. 7 Nov. 2018. https://www.nytimes.com/2018/11/07/business/facebook-midterms-misinformation.html. Accessed 9 Dec. 2018.
55 Frenkel, S., N. Confessore, C. Kang, M. Rosenberg, and J. Nicas. "Delay, Deny and Deflect: How Facebook's Leaders Fought Through Crisis." New York Times. 14 Nov. 2018. https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html. Accessed 14 Feb. 2019.
56 Frenkel, S., N. Confessore, C. Kang, M. Rosenberg, and J. Nicas. "Delay, Deny and Deflect: How Facebook's Leaders Fought Through Crisis." New York Times. 14 Nov. 2018. https://www.nytimes.com/2018/11/14/technology/facebook-data-russia-election-racism.html. Accessed 14 Feb. 2019.
57 Frenkel, S., and M. Isaac. "Inside Facebook's Election 'War Room'." New York Times. 19 Sep. 2018. https://www.nytimes.com/2018/09/19/technology/facebook-election-war-room.html. Accessed 18 Nov. 2018.
58 Fandos, N., and K. Roose. "Facebook Identifies an Active Political Influence Campaign Using Fake Accounts." New York Times. 31 July 2018. https://www.nytimes.com/2018/07/31/us/politics/facebook-political-campaign-midterms.html?module=inline. Accessed 9 Dec. 2018.
59 Frenkel, S. "Facebook Tackles Rising Threat: Americans Aping Russian Schemes to Deceive." New York Times. 11 Oct. 2018. https://www.nytimes.com/2018/10/11/technology/fake-news-online-disinformation.html?module=inline. Accessed 11 Dec. 2018.
60 Bowles, N. "'I Don't Really Want to Work for Facebook.' So Say Some Computer Science Students." New York Times. 15 Nov. 2018. https://www.nytimes.com/2018/11/15/technology/jobs-facebook-computer-science-students.html. Accessed 9 Dec. 2018.
61 Oremus, W. "It's Time for Facebook's Workers to Speak Out." Slate. 29 Nov. 2018. https://slate.com/technology/2018/11/facebook-workers-should-speak-up-about-their-company-right-now.html. Accessed 11 Dec. 2018.
62 Frenkel, S. "Zuckerberg Takes Steps to Calm Facebook Employees." New York Times. 23 Mar. 2018. https://www.nytimes.com/2018/03/23/technology/zuckerberg-facebook-employees.html. Accessed 12 Dec. 2018.
63 Roose, K. "Facebook Thwarted Chaos on Election Day. It's Hardly Clear That Will Last." New York Times. 7 Nov. 2018. https://www.nytimes.com/2018/11/07/business/facebook-midterms-misinformation.html. Accessed 9 Dec. 2018.

The Erb Institute is committed to creating a socially and environmentally sustainable society through the power of business. Building on nearly two decades of research, teaching, and direct engagement, the Institute has become one of the world's leading sources of innovative knowledge on the culture, technologies, operations and governance of business in a changing world.
http://erb.umich.edu

Established at the University of Michigan in 1992, the William Davidson Institute (WDI) is an independent, non-profit research and educational organization focused on providing private-sector solutions in emerging markets. Through a unique structure that integrates research, field-based collaborations, education/training, publishing, and University of Michigan student opportunities, WDI creates long-term value for academic institutions, partner organizations, and donor agencies active in emerging markets. WDI also provides a forum for academics, policy makers, business leaders, and development experts to enhance their understanding of these economies. WDI is one of the few institutions of higher learning in the United States that is fully dedicated to understanding, testing, and implementing actionable, private-sector business models addressing the challenges and opportunities in emerging markets.