Overview
Significance of the Problem
As technology has advanced, social media has come to occupy a huge part of citizens’ daily lives. Facebook has about 3 billion monthly users, TikTok has 1 billion, Twitter has 350 million, and Reddit has 250 million.[1] Together, these figures approach half the world’s population, showcasing how widespread social media’s reach is. Social media has shown what it means to unite globally, or be on the same page about a topic, creating a shared global consciousness unlike anything before.
As social media has gained popularity, the need to obtain news from dedicated news outlets such as CNN or the New York Times has decreased. “In 2019, more than 70 percent of American adults consumed news on social media, compared to fewer than one in eight Americans in 2008.”[2] The transformation of social media into a news platform has significantly reshaped the definition of journalism and blurred the lines between fact and fiction.
[1] Pew Research Center. 2023. “Social Media and News Fact Sheet.” November 15, 2023. https://www.pewresearch.org/journalism/fact-sheet/social-media-and-news-fact-sheet/
[2] Levy, Ro’ee. 2021. “Social Media, News Consumption, and Polarization: Evidence from a Field Experiment.” American Economic Review 111 (3): 831–70. https://doi.org/10.1257/aer.20191777
A further concern is that advanced technologies make misinformation much easier to spread on social media. Tools such as artificial intelligence, deepfakes, and highly detailed editing software can cause false information to be accepted as true. This is not merely a hypothetical concern. During the COVID-19 pandemic, medical misinformation ran rampant on social media, and medical misinformation is the most dangerous, worst-case form of the problem. “During COVID-19, falsifications of information led people to decline vaccines, reject public health measures, and use unproven treatments. Health misinformation has also led to harassment and violence against health workers.”[4] In certain cases, misinformation has even led to death, when people trusted erroneous medical techniques and fell ill for lack of proper treatment. With human lives at stake, it is imperative to reduce or eliminate the problem of misinformation. Information on social media platforms must be regulated more strictly, and through multiple means: platforms must enact stronger rules for users who spread misinformation, up to and including banning them. The question of free speech arises when the government is involved, so this topic must be trodden lightly, with clear-cut limitations that target only harmful misinformation.

[3] Ibid.
[4] U.S. Department of Health and Human Services. 2021. “Health Misinformation—Current Priorities of the U.S. Surgeon General.” https://www.hhs.gov/surgeongeneral/priorities/health-misinformation/index.html
[5] “How Misinformation on Social Media Has Changed News.” 2023. US PIRG Education Fund. August 14, 2023. https://pirg.org/edfund/articles/misinformation-on-social-media/
Stakeholders
Because social media is so prevalent, it is fair to say that every individual with access to it (billions of people) is a stakeholder in this misinformation epidemic; with fatalities resulting from medical misinformation, lives are literally at stake. These individuals view and share content, and they can unintentionally spread false information or fall victim to it, affecting their beliefs, behaviors, and decisions. Social media companies are also essential stakeholders, and arguably the most powerful, as they host and disseminate content, making them responsible for the spread of misinformation on their platforms. These companies will also bear the most responsibility for regulating content on their private platforms, since national and local governments are tightly limited in restricting the right to free speech under the First Amendment.
Policy Deliberation
Existing Government Policies Addressing Misinformation
Because the rise of social media is relatively recent, clear-cut laws and regulations have not yet been established; they are in the early stages of development, with years of approval processes ahead. Still, existing policies may cover some of the scope of social media misinformation, and expanding upon them is a sensible place for regulation to start.
Currently, Section 230 of the Communications Act of 1934 (added by the Communications Decency Act of 1996) is highly relevant to curbing the effects of misinformation. The provision shields providers of interactive computer services from being held legally responsible for harmful information developed by others on their services.[7] Rather than shifting blame onto the hosting service, the individuals who actually speak or spread misinformation are held accountable for the damage their actions cause. The protection does not extend to content that companies themselves create, so platforms remain responsible for any misinformation they originate. This framework is useful because it places legal consequences on the individuals disseminating harmful speech, which can include misinformation. Nevertheless, it shields small blogs, websites, major platforms, and individual users alike from liability, with only a limited subset of extreme cases of misinformation resulting in legal repercussions.
[6] Menczer, Filippo. 2021. “Here’s Exactly How Social Media Algorithms Can Manipulate You.” Big Think. October 7, 2021. https://bigthink.com/the-present/social-media-algorithms-manipulate-you/
[7] “Section 230: An Overview.” 2021. Congressional Research Service. https://crsreports.congress.gov/product/pdf/R/R46751
Potential Regulations
Existing Social Media Platform Rules
Currently, social media platforms have their own methods of taking down false information. TikTok has Community Guidelines that protect users from various kinds of harmful content, from harassment and bullying to misinformation. Videos that violate the TikTok Community Guidelines are promptly taken down or marked with indicators warning that they may contain false information. Though these efforts may seem sufficient, they are far from adequate: the multi-billion-dollar platform is under investigation by the European Commission “due to concerns that it exposes young users to inappropriate content,”[9] suggesting that TikTok is not removing misinformation effectively enough. With billions of hours of content posted daily, there is always room for improvement.
[8] D’Virgilio, Allegra. 2022. “The US Government’s Role in Regulating Social Media Disinformation.” Northeastern University Political Review. May 19, 2022. https://nupoliticalreview.org/2022/05/19/the-us-governments-role-in-regulating-social-media-disinformation/
[9] “TikTok Shares Latest Data on Content Removals and Enforcement Actions.” n.d. Social Media Today. https://www.tiktok.com/transparency/en-us/combating-misinformation/
[Figure: Removal of content has decreased despite a growing TikTok user base.[10]]
[10] Ibid.
[11] Madrid, Pamela. 2023. “Study Reveals Key Reason Why Fake News Spreads on Social Media.” USC Today. https://today.usc.edu/usc-study-reveals-the-key-reason-why-fake-news-spreads-on-social-media/
One way to encourage accurate sharing practices is to incentivize accuracy, which social media platforms have the power to do through their reward algorithms.
User Feedback
Another useful tool is user feedback. Most social media platforms, including TikTok, Instagram, and Facebook, provide buttons to report content. Reports help narrow down which posts moderation algorithms and staff should check, especially when content is reported specifically for misinformation. Additionally, technology itself can be turned against misinformation: systems built with machine learning and artificial intelligence can scan thousands upon thousands of pieces of content for falsehoods far more efficiently than human workers. In this way, technological advancements can serve better purposes and improve the quality of information put into the world.
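As a rough illustration of how user reports and automated scanning might combine to prioritize content for review, consider the toy heuristic below. The keyword list, weights, and function names are invented for illustration only; real platforms rely on trained classifiers rather than keyword matching, but the pipeline shape is the same: user signals plus automated scoring feeding a human review queue.

```python
# Toy triage heuristic: combine user reports with a crude text scan to
# rank posts for human moderator review. Entirely illustrative.

MISINFO_CUES = {"miracle cure", "doctors hate", "vaccine hoax"}  # invented list


def review_priority(text: str, report_count: int) -> float:
    """Higher score = review sooner. Each user report adds 1 point;
    each misinformation-style phrase found in the text adds 2."""
    lowered = text.lower()
    cue_hits = sum(cue in lowered for cue in MISINFO_CUES)
    return report_count + 2.0 * cue_hits


posts = [
    ("This miracle cure is what doctors hate!", 3),  # 3 reports, 2 cues -> 7.0
    ("Photos from my weekend hike", 0),              # 0 reports, 0 cues -> 0.0
]
# Sort the review queue so the most suspicious post is checked first.
queue = sorted(posts, key=lambda p: review_priority(*p), reverse=True)
print([round(review_priority(t, r), 1) for t, r in queue])  # [7.0, 0.0]
```

In a real system, the scoring function would be a trained model and the cue list would be learned from labeled data; the point here is only that reports and automated scoring can jointly narrow billions of posts down to a reviewable queue.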
Limiting Protections
Certain states have taken steps forward by narrowing the protections that Section 230 gives social media companies. The Texas Supreme Court ruled that in cases of sex trafficking and sexual violence, social media sites are afforded no such protections.[12] Expanding the set of cases in which platforms are unprotected is a great step in the right direction. Clearly defining which harms of misinformation are targeted (out of the many that exist) allows limitations to coexist with the First Amendment, and gives social media companies more incentive to stop misinformation. When a platform is not afforded these protections, it loses immunity against the legal trouble that the harm of misinformation can generate. No platform willingly lands in legal trouble, nor does it want to be held accountable for its users’ actions, so companies will adopt more rules and regulations against users who spread disinformation.

[12] “FTC Issues Orders to Social Media and Video Streaming Platforms Regarding Efforts to Address Surge in Advertising for Fraudulent Products and Scams.” 2023. Federal Trade Commission. March 16, 2023. https://www.ftc.gov/news-events/news/press-releases/2023/03/ftc-issues-orders-social-media-video-streaming-platforms-regarding-efforts-address-surge-advertising
International Recommendations
The United States can also take recommendations from other countries. Germany’s Network Enforcement Act exists to strictly and promptly combat hate speech and fake news on social media networks. The Act obligates covered social media networks to remove content that is “clearly illegal” within 24 hours of receiving a user complaint; if the illegality of the content is not obvious on its face, the network has seven days to investigate and delete it. A network may be fined up to 50 million euros (about US$59.2 million) for noncompliance.[13] A fee that hefty leaves social media companies rushing to remove falsified content immediately. Punishments on this scale should be implemented in the US to hold companies in check, especially given how concentrated the major social media networks are in the US.
Conclusion/Call to Action
In conclusion, the prevalence of misinformation on social media platforms presents a critical problem in today’s digital-dependent era. Because social media sites substitute for traditional news sources, medical misinformation runs rampant, along with other dangerous forms of false information that can result in death. While governmental policies like Section 230 of the Communications Act offer some accountability, limits on regulatory power hinder comprehensive solutions. To combat this issue, a multi-pronged approach should be used, involving not only the government but also social media platforms and their internal regulations.
The government must recognize the dangers of misinformation and act on them. Under the First Amendment, certain categories of speech that cause harm or danger are not protected, allowing for greater government regulation of online speech. Moreover, the government should explore mandates inspired by international laws that place legal repercussions on social media sites, which would push platforms to tighten their own rules for users in order to avoid legal trouble.
[13] Gesley, Jenny. 2021. “Germany: Network Enforcement Act Amended to Better Fight Online Hate Speech.” Library of Congress. July 6, 2021. https://www.loc.gov/item/global-legal-monitor/2021-07-06/germany-network-enforcement-act-amended-to-better-fight-online-hate-speech/
Simultaneously, social media sites must change their reward structures. Those structures should incentivize accurate content over merely popular content, naturally encouraging users to post accurately. Furthermore, sites should strictly punish users who spread false information and leverage technological advancements for content moderation. Ultimately, only through collaborative effort can we mitigate the impact of misinformation and uphold the integrity of information shared on social media platforms.
Works Cited
D’Virgilio, Allegra. 2022. “The US Government’s Role in Regulating Social Media Disinformation.” Northeastern University Political Review. May 19, 2022. https://nupoliticalreview.org/2022/05/19/the-us-governments-role-in-regulating-social-media-disinformation/

“FTC Issues Orders to Social Media and Video Streaming Platforms Regarding Efforts to Address Surge in Advertising for Fraudulent Products and Scams.” 2023. Federal Trade Commission. March 16, 2023. https://www.ftc.gov/news-events/news/press-releases/2023/03/ftc-issues-orders-social-media-video-streaming-platforms-regarding-efforts-address-surge-advertising

Gesley, Jenny. 2021. “Germany: Network Enforcement Act Amended to Better Fight Online Hate Speech.” Library of Congress. July 6, 2021. https://www.loc.gov/item/global-legal-monitor/2021-07-06/germany-network-enforcement-act-amended-to-better-fight-online-hate-speech/

“How Misinformation on Social Media Has Changed News.” 2023. US PIRG Education Fund. August 14, 2023. https://pirg.org/edfund/articles/misinformation-on-social-media/

Levy, Ro’ee. 2021. “Social Media, News Consumption, and Polarization: Evidence from a Field Experiment.” American Economic Review 111 (3): 831–70. https://doi.org/10.1257/aer.20191777

Madrid, Pamela. 2023. “Study Reveals Key Reason Why Fake News Spreads on Social Media.” USC Today. https://today.usc.edu/usc-study-reveals-the-key-reason-why-fake-news-spreads-on-social-media/

Menczer, Filippo. 2021. “Here’s Exactly How Social Media Algorithms Can Manipulate You.” Big Think. October 7, 2021. https://bigthink.com/the-present/social-media-algorithms-manipulate-you/

Pew Research Center. 2023. “Social Media and News Fact Sheet.” November 15, 2023. https://www.pewresearch.org/journalism/fact-sheet/social-media-and-news-fact-sheet/

“Section 230: An Overview.” 2021. Congressional Research Service. https://crsreports.congress.gov/product/pdf/R/R46751

“TikTok Shares Latest Data on Content Removals and Enforcement Actions.” n.d. Social Media Today. https://www.tiktok.com/transparency/en-us/combating-misinformation/

U.S. Department of Health and Human Services. 2021. “Health Misinformation—Current Priorities of the U.S. Surgeon General.” https://www.hhs.gov/surgeongeneral/priorities/health-misinformation/index.html