Simone T. Rodrigues
Ryan Scorgie
Introduction
Issue Overview
    Content Moderation
Stakeholder Positions
    Users
    Meta
    Shareholders
    Advertisers
Recommendations
References
Facebook [Meta] and its Content Moderation Debacle: Navigating the Chaos of Online
Discourse
In the digital age, social media platforms have evolved into outlets for expression, discussion, and information sharing among a diverse global community. Facebook, owned by the parent company Meta, is currently the largest platform in the world, with over 2.38 billion monthly active users, and ranks third globally in terms of internet engagement (Singh, 2019). The vast number of users on the platform raises concerns about protecting their rights to freedom of speech and data privacy while also safeguarding them from harmful content.
Facebook prides itself on being a medium for self-expression, allowing users to share their experiences across countries and cultures and empowering people to connect and communicate. Meta’s stated goal is to ‘give people a voice’ and to encourage users to freely and openly discuss topics that are important to them through written posts, images, and other media available on the platform, regardless of whether those topics align with the opinions of others. This commitment to building a free-speech-oriented community makes it challenging to develop policies that effectively tackle the moderation of harmful user-generated content while also ensuring that these policies do not silence legitimate expression.
Since the controversy surrounding the Trump election and the flood of misinformation during the coronavirus pandemic, Facebook has faced worldwide criticism for its weak community standards and for contingency plans inadequate to the enormity of the situation. The volume and velocity of harmful content have reached unprecedented levels, and the impact of this content now transcends the platforms where it appears, leading to a surge in criticism of these platforms and their inadequate content moderation practices (Gillespie, 2020).
Issue Overview
Content Moderation
Social media platforms use content moderation, which combines privately created platform regulations with state legislation, to evaluate and control user-generated content and behaviour. However, Facebook’s encouragement of free speech has given rise to unwanted content such as hate speech, terrorist propaganda, harassment, bullying, and graphic violence. It is therefore becoming critical for Facebook to shift from its uniform global moderation systems to advanced predictive technology that controls the type of content its users are exposed to. Social media platforms now regulate speech more than governments and political organisations do, but the guidelines and standards by which these businesses control content are ambiguous and inconsistently applied. In some cases, content that might go against community standards is allowed if it is of interest to the general public or is newsworthy. Nevertheless, given the pervasiveness and frequency of objectionable content on the platform, Meta needs to revise its strategies and implement policies, in compliance with legal frameworks, to restrict such content and focus on user safety.
Meta’s current moderation practices have faced scrutiny worldwide for faulty violation detection as well as for the decisions taken by its Oversight Board. Its automated content moderation system also lacks adequate technology: there have been many instances reported in the media where cases of critical misconduct went unnoticed while ordinary user content was wrongly flagged or removed.
Stakeholder Positions
Users
Meta’s failure to moderate content effectively has caused a divide among its largest stakeholder group. Although the current content moderation system was implemented to protect users from the detrimental impact of consuming harmful content, many users believe that the system’s frequent errors have taken away their voice on the platform. One group of users, those who have experienced detrimental effects on their mental health, privacy, and safety, advocates for an increased emphasis on user safety through the elimination or restriction of certain content. Users in the second group, by contrast, are frustrated by the arbitrary limitations placed on their accounts as a result of system errors and by the infringement of their freedom of expression.
Meta
As Facebook’s parent company, Meta bears the damage to brand, goodwill, and reputation caused by the backlash against Facebook’s poor content moderation practices. Intense media and public scrutiny, coupled with legal complications and court trials, creates a negative perception of the company, resulting in substantial losses. Meta must make a crucial decision between staying committed to its free-speech policy and taking corrective action to curb the spread of harmful content.
Shareholders
Currently trading at its lowest level since early 2019, Meta’s stock is among the worst performers in the S&P 500 this year. Advertiser spending is declining, and user turnover is creating the conditions for Meta to report a second consecutive quarter of declining revenue (Vanian, 2022). While Facebook does not face the risk of business closure, the tarnished brand name and perceived lack of social accountability make it harder to recruit new talent and to maintain investor confidence.
Advertisers
Advertisers around the world have taken action to protect their own brand image and avoid being associated with a platform linked to hate speech and other objectionable content. Although removing Facebook from their marketing strategies reduces the effectiveness of their campaigns, brands would rather bear the loss than give the impression that they are complacent about Meta’s content control failures. Both the advertisers and Meta’s advertising revenue suffer as a result of this situation (Facebook Boycott Has Negative Impact on Marketing Effectiveness of Participating Brands, n.d.).
Content Moderators
Content moderators, the employees who review flagged material, struggle to keep pace with the enormous volume of user-generated content. The criticism that the company faces also impacts the perceived competency and reputation of these employees. They need better working conditions, mental health services, and more precise rules to help them deal with the challenges of moderating different types of content.
Activist Groups
Activist groups are committed to maintaining a safe and healthy online environment. These groups demand increased regulation of online content and urge Meta to strike a balance between preserving the platform as a space for free speech and promoting the safety and wellbeing of its community.
Government and Regulatory Bodies
Governments and regulatory bodies have increased their scrutiny of Meta’s content control and community standards in recent years. In response to public demands, there has been a considerable push for transparency and accountability, and governments have been introducing stricter laws to hold the company responsible for eliminating harmful content. These regulations are intended to ensure public safety and the responsible use of internet platforms for engagement and discourse.
Recommendations
After analysing the issue and its impact on stakeholders, the following recommendations can be made to address the situation and prepare long-term plans and policies.
1. Improved Moderation Technology: Automated detection systems should be strengthened to support the decisions of the Oversight Board and the team of human moderators. Recent years have seen rapid development in AI technology, which shows a promising future for automated content moderation at scale.
2. User Grievance Management: The faulty content control systems have led to the banning or restriction of several harmless user accounts, provoking extreme user frustration. The current procedure for account recovery, content reporting, and appeals is complicated, ambiguous, and slow. A dedicated team should be established to resolve grievances faster, with clear and timely communication to affected users.
3. Revised Company Ideology: The shift into the digital age necessitates a revision of the company’s ideology and its commitment to free speech and expression on the platform. Users must be made aware of what is acceptable on the platform and of the consequences they might face for violating the new and improved community standards.
4. Content Moderation Policies: In collaboration with governments and regulatory bodies, a clear set of policies should be formulated for uniform content moderation across the platform globally. These policies should include regular monitoring and frequent published reports on the efficacy of the program to ensure transparency and accountability. Stringent rules and penalties should be established to keep in check users who have a knack for circumventing moderation practices. The company should also cooperate with the relevant authorities when serious violations occur.
5. User Education and Media Literacy Campaigns: As one of the platforms with the largest user base, Meta should take on the responsibility of educating users about digital ethics, responsible online conduct, and appropriate use of the platform. The focal point of the company’s advertisements and promotional materials should shift from the platform’s features to users’ accountability and their ability to responsibly access, use, and engage with content.
6. Open Discussions with Activist and Advocacy Groups: Panel discussions can be held with activist groups to understand the impact of the company’s business practices on neglected groups of people and to take remedial action. An open dialogue should be encouraged to gather valuable insights into what the user community expects from the business and how Meta can work collaboratively with these groups to achieve stakeholder satisfaction.
7. Improved Transparency Reporting: Meta’s existing transparency reports publish the numbers of issues addressed. Although this provides a certain amount of transparency, there is still room for improvement in measuring the degree of progress and the influence of new approaches to content moderation problems. These reports currently concentrate on quantitative information about the content that has been identified, but they do not provide a thorough evaluation of how well implemented measures have worked over time. Additional metrics that gauge progress in content moderation should be incorporated, such as the decline in the frequency of offensive content, the promptness of responses to concerns raised, and the precision of content removal.
8. Feedback from Stakeholders: Regular surveys can be conducted with the various stakeholder groups to gather their views on the company’s moderation practices and policies.
By implementing these recommendations, Meta can rebuild the trust and confidence of its stakeholders.
References
Facebook boycott has negative impact on marketing effectiveness of participating brands. Saïd Business School, University of Oxford. https://www.sbs.ox.ac.uk/news/facebook-boycott-has-negative-impact-marketing-effectiveness-participating-brands

Meta. Facebook community standards. https://transparency.fb.com/en-gb/policies/community-standards/

Gillespie, T. (2020). Content moderation, AI, and the question of scale. Big Data & Society, 7(2).

Grimmelmann, J. (2015). The virtues of moderation. Yale Journal of Law and Technology, 17, 42-109.

Glimpse from its Oversight Board’s First Year of Operation. Comparative Law and Language, 1(2).

content moderation on social media platforms. New Media & Society, 20(11), 4366-4383. https://doi.org/10.1177/1461444818773059

Singh, S. (2019). Everything in moderation: An analysis of how internet platforms are using artificial intelligence to moderate user-generated content. New America. https://newamerica.org/oti/reports/everything-moderation-analysis-how-internet-platforms-are-using-artificialintelligence-moderate-user-generated-content/

Vanian, J. (2022, September 30). Facebook scrambles to escape stock’s death spiral as users flee and sales drop. CNBC. escape-death-spiral-as-users-flee-sales-drop.html