
Facebook [Meta] and its Content Moderation Debacle: Navigating the Chaos of Online Discourse

Simone T. Rodrigues

Bob Gaglardi School of Business and Economics, Thompson Rivers University

BUSN6010_01: Ethics and Corporate Social Responsibility

Ryan Scorgie

November 24, 2023


Table of Contents

Introduction
Issue Overview
Content Moderation
Stakeholder Positions
Users
Meta
Shareholders
Advertisers
Employees – Content Moderators
Activists & Advocacy Groups
Government & Regulatory Bodies
Recommendations
References

Facebook [Meta] and its Content Moderation Debacle: Navigating the Chaos of Online Discourse

In the digital age, social media platforms have evolved into outlets for expression, discussion, and information sharing among a diverse global community. Facebook, owned by the parent company Meta, is currently the largest platform in the world, with over 2.38 billion monthly active users, and ranks third globally in terms of internet engagement (Singh, 2019). The platform's vast user base raises concerns about protecting users' rights to free speech and data privacy while also safeguarding them from, and regulating, harmful content and hate speech.

Facebook prides itself on being a medium for self-expression, allowing users to share their experiences across countries and cultures and empowering people to connect and communicate. Meta's goal is to 'give people a voice' and encourage them to discuss the topics that matter to them freely and openly, in the form of written posts, images, and the other media available on the platform, regardless of whether those topics align with the opinions of others (Facebook Community Standards | Transparency Center, n.d.). Meta's commitment to building a free-speech-oriented community makes it challenging to develop policies that effectively moderate harmful user-generated content while remaining consistent with the company's ethos.

Ever since the controversy surrounding the Trump election and the flood of misinformation during the COVID-19 pandemic, Facebook has faced worldwide criticism for its weak community standards and for contingency plans inadequate to the enormity of the situation. The volume and velocity of harmful content have reached unprecedented levels, and the impact of these online atrocities now transcends the platforms where they occur, fuelling growing criticism of these platforms and their inadequate content moderation practices (Gillespie, 2020).

Issue Overview

Content Moderation

Grimmelmann (2015) defines content moderation as "the governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse." Platforms use content moderation, which combines the application of privately created platform regulations with state legislation, to evaluate and control user-generated content and actions (Gillett et al., 2023).

Over the past decade, Facebook's facilitation of user-generated content and encouragement of free speech have given rise to unwanted content such as hate speech, terrorist propaganda, harassment, bullying, and graphic violence. It is therefore becoming critical for Facebook to move from its uniform global moderation system toward advanced, prediction-based technology that controls the type of content its users are exposed to. Social media platforms now regulate speech more than governments and political organisations do, yet the guidelines and standards by which these businesses control content are ambiguous, and the system is riddled with inaccuracies.

Facebook's Transparency Center states that, under its policies, content that might otherwise violate community standards is allowed in certain cases if it is newsworthy or in the public interest. Nevertheless, given the pervasiveness and frequency of objectionable content on the platform, Meta needs to revise its strategies and implement policies, in compliance with legal frameworks, that restrict such content and focus on safe and positive user experiences.

Meta's current moderation practices have faced scrutiny worldwide, both for faulty violation detection and for the decisions taken by its Oversight Board. Its automated content moderation system also lacks adequate technology: there have been many reported instances in which cases of serious misconduct went unnoticed while ordinary user content was mistakenly flagged and banned.

Stakeholder Positions

Users

Meta's failure to moderate content effectively has caused a divide among its largest stakeholder group. Although the current content moderation system was implemented to protect users from the detrimental effects of consuming harmful content, many users believe that the system's frequent errors have taken away their voice on the platform. One group of users, those who have experienced harm to their mental health, privacy, and safety, advocates for a stronger emphasis on user safety through the elimination or restriction of certain content. Users in the second group, by contrast, are frustrated by the arbitrary limitations placed on their accounts as a result of system errors and by the infringement of their freedom of speech (Myers West, 2018).

Meta

The backlash Facebook has faced for its poor content moderation practices damages the brand, goodwill, and reputation of its parent company. Harsh media and public scrutiny, coupled with legal complications and court trials, creates a negative perception of the company and results in substantial losses. Meta faces a crucial decision between staying committed to its free-speech policy and taking corrective action to curb the spread of offensive content through more stringent policies and more effective algorithms.

Shareholders

Trading at its lowest level since early 2019, Meta's stock was among the worst performers in the S&P 500 in 2022. Advertiser spending is declining, and user turnover is creating the conditions for Meta to report a second consecutive quarter of declining revenue (Vanian, 2022). While Facebook does not face the risk of business closure, its tarnished brand name and lack of social accountability are making it harder to recruit new talent, maintain profitability, and retain shareholders' interest in the company.

Advertisers

Advertisers around the world have taken action to protect their own brand image and avoid being associated with a platform linked to hate speech and other objectionable content. Although removing Facebook from their marketing strategies reduces the effectiveness of their campaigns, brands would rather bear that loss than give the impression that they condone Meta's lax content controls. Both the advertisers and Meta's advertising revenue suffer as a result (Facebook Boycott Has Negative Impact on Marketing Effectiveness for Participating Brands | Saïd Business School, 2020).

Employees – Content Moderators

Meta's team of human moderators faces tremendous pressure to review an enormous volume of user-generated content, and the criticism the company faces also reflects on the competence and reputation of these employees. Content moderators frequently call for better working conditions, mental health services, and clearer guidelines to help them cope with the challenges of moderating disturbing content.

Activists & Advocacy Groups

Activist groups are committed to maintaining a safe and healthy online environment. These groups demand increased regulation of online content and urge Meta to strike a balance between users' free expression on the platform and the well-being of those consuming its content.

Government & Regulatory bodies

Government and regulatory bodies have increased their scrutiny of Meta's content controls and community standards in recent years. In response to public demands, there has been a considerable push for transparency and accountability, and governments have been introducing stricter laws to hold the company accountable for removing harmful content. These regulations are intended to ensure public safety and the responsible use of internet platforms for engagement and discourse.

Recommendations

After analyzing the issue and its impact on stakeholders, the following recommendations can be made to address the situation and to inform long-term plans and policies.

1. Investment in AI to support content moderation: Additional funds should be allocated to building an improved, algorithm-based predictive content moderation model to support the decisions of the Oversight Board and the team of human moderators. Recent years have seen rapid development in AI technology, which suggests a promising future for integrating advanced AI into content moderation.

2. User Grievance Management: Faulty content controls have led to the banning or restriction of many harmless user accounts, causing intense user frustration. The current procedures for account recovery, content reporting, and appeals are complicated, ambiguous, and slow. A dedicated team offering faster resolution and clear, transparent case reports could help relieve this frustration.

3. Revised company ideology: The shift into the digital age necessitates a revision of the company's ideology and of its commitment to free speech and expression on the platform. Users must be made aware of what is acceptable on the platform and of the consequences they may face for violating the new and improved community standards.

4. Content Moderation Policies: In collaboration with governments and regulatory bodies, a clear set of policies should be formulated for uniform content moderation across the platform globally. These policies should include regular monitoring and the frequent publication of reports on the efficacy of the program to ensure transparency and accountability. Stringent rules and penalties should be established to deter users who repeatedly circumvent moderation practices. The company should also cooperate with governments and comply with regulations pertaining to user safety.

5. User education and media literacy campaigns: As one of the platforms with the largest user base, Meta should take on the responsibility of educating users about digital ethics, responsible online conduct, and appropriate use of the platform. The focus of the company's advertisements and promotional materials should shift from the platform's features to users' accountability and their ability to access, use, and engage with content responsibly.

6. Open discussions with activist and advocacy groups: Panel discussions can be held with activist groups to understand the impact of the company's business practices on neglected groups of people and to take remedial action. An open dialogue should be encouraged to gather valuable insights into what the user community expects from the business and how Meta can work collaboratively with these groups to achieve stakeholder satisfaction.

7. Transparency reports: Meta's current transparency reports consist of little more than quarterly counts of the issues addressed. Although this provides a degree of transparency, there is room for improvement in measuring progress and the influence of new approaches to content moderation problems. The reports concentrate on quantitative information about the content that has been identified but do not provide a thorough evaluation of how well the implemented measures have worked over time. Additional metrics that gauge progress in content moderation should be incorporated, such as the decline in the prevalence of offensive content, the promptness of responses to concerns raised, and the precision of content removal.

8. Feedback from Stakeholders: Regular surveys can be conducted with various stakeholder groups to understand the shortcomings of the company's strategy implementation, and feedback can be collected on how to enhance the effectiveness of these policies.

By implementing these recommendations, Meta can bolster trust and confidence among all the stakeholders involved in this content moderation issue.

References

Facebook boycott has negative impact on marketing effectiveness for participating brands | Saïd Business School. (2020, July 2). www.sbs.ox.ac.uk. https://www.sbs.ox.ac.uk/news/facebook-boycott-has-negative-impact-marketing-effectiveness-participating-brands

Facebook Community Standards | Transparency Center. (n.d.). transparency.fb.com. https://transparency.fb.com/en-gb/policies/community-standards/

Gillespie, T. (2020). Content moderation, AI, and the question of scale. Big Data & Society, 7(2), 205395172094323. https://doi.org/10.1177/2053951720943234

Grimmelmann, J. (2015). The virtues of moderation. Yale Journal of Law and Technology, 17, 42-109.

Leung, J. (2022). Shortcuts and shortfalls in Meta's content moderation practices: A glimpse from its Oversight Board's first year of operation. Comparative Law and Language, 1(2).

Myers West, S. (2018). Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms. New Media & Society, 20(11), 4366-4383. https://doi.org/10.1177/1461444818773059

Singh, S. (2019). Everything in moderation: An analysis of how internet platforms are using artificial intelligence to moderate user-generated content. New America. newamerica.org/oti/reports/everything-moderation-analysis-how-internet-platforms-are-using-artificialintelligence-moderate-user-generated-content/

Vanian, J. (2022, September 30). Facebook scrambles to escape stock's death spiral as users flee, sales drop. CNBC. https://www.cnbc.com/2022/09/30/facebook-scrambles-to-escape-death-spiral-as-users-flee-sales-drop.html
