
Trigger Warning

In an era where we consume more than we create, how aware are we of the dark reality behind the social media we use? Would you believe that filtering through graphic violence, exploitation, extremism and abuse for more than eight hours a day is a full-time job? Well, it is. Content moderation: in simple words, the policing of social platforms. It is a profession that is rarely, if ever, discussed. Sounds simple enough, right? Sit at a desk all day, look at a couple of videos, take them down if they violate the platform's community standards, and sleep soundly at night knowing you've saved the internet. Well, that is far from the truth. Sifting through the most gruesome content for hours on end can adversely affect your mental state.

A former content moderator at WhatsApp (hired on a contract basis through Accenture) aptly described content moderators as the "coal workers of the Silicon Valley." Because these roles are usually outsourced to contractors, the platforms themselves do not monitor working conditions. We've all heard of Facebook's and Google's huge offices, with sleeping pods, gaming spaces and much more. Sadly, the people who sustain these platforms, soaking up the worst that this world has to offer, do not enjoy the same benefits. While no amount of money could ever make up for the mental toll such a profession takes, the pay doesn't even come close to that of a typical social media employee! On top of this, the psychological support provided is laughable.

Sessions with 'wellness coaches' who aren't even qualified doctors and weekly 'coloring sessions' prove inadequate against the long-term mental-health consequences this job causes: PTSD, secondary trauma, and anxiety, to name a few. Constant exposure to such graphic imagery can completely disrupt an average person's well-being. It can even cause intense paranoia. Growing accustomed to evidence of the endless depths of human depravity can lead moderators to suspect the worst of anyone they meet, wondering what their computers might hold; trusting people can become incredibly hard, if not impossible.

"I will engage in a mandatory wellness coaching session,


but I understand that those are not conditions and may not be
sufficient to prevent my contracting PTSD." – This is an actual
line from an NDA signed by a Facebook CM.
Several companies make content moderators sign stringent
Non-Disclosure Agreements(NDAs) that prevent employees from
discussing their work with friends, family and fellow employees.
The job description itself entails intense emotional stress, and
not being able to discuss this with colleagues only makes it
worse! Deciding whether a post should be taken down isn't a
simple yes/no process i.e. there are several rules and
regulations to be followed. Sometimes, even if a post contains
offensive material, it stays up because of certain loopholes in
the guidelines. A moderator is judged based on an "accuracy
score" that measures how often their decisions to withdraw posts
are deemed "correct" as per ever-changing social media policies.
An accuracy score of less than 98% can sometimes cost them their
job.

Social media giants continue to make tall claims about the proficiency of artificial intelligence at filtering out offensive material. Facebook CEO Mark Zuckerberg claimed that Facebook would use AI to detect "the vast majority of problematic content" by the end of 2019. The statistics, however, say otherwise: Facebook's internal records show that its AI systems identified posts that generated as little as 2% of the hate-speech views on the platform. There is no single, perfect solution to this problem. It is extremely difficult for tech companies to build software that reliably identifies derogatory content. Even so, sufficient funding and research could improve these numbers considerably!

Unquestionably, content moderation is a mentally exacting occupation. Content moderators must be given proper psychological support, adequate compensation and better working conditions.

We're only human, after all. Even with the best counseling in the world, confronting the heart of human darkness causes irreparable damage. Most of the time, the people who post this content are the ones responsible for what's happening in the video. It is chilling to think that someone could be proud of committing such heinous offenses. Sort of makes us wonder where humanity is heading, doesn't it?

All of this leads us to ask one question: what is the human cost of keeping the internet clean?
