Before watching the video, I had never really thought about moderation. My main exposure to it was through my brother and his PlayStation: his friends would say inappropriate or hurtful things in game chats or chat rooms and get banned minutes later. Before this discussion, I honestly believed that moderation was handled only by a company's employees, and I assumed that the same people who handle my complaints about changing my password or account name were the ones handling moderation. After watching the video, it makes a lot of sense that many platforms hire teams of individuals trained specifically to deal with moderation concerns, but what really interested me were the different
alternatives many big platforms use to moderate aside from hiring internal teams. Third-party reviewers were not a huge shock, since many companies hire or branch out to third-party firms to help in some sector, but community managers were something I had never thought of before. I have multiple social media accounts, and there are a few where I joined groups that required acceptance by an admin. In Facebook groups especially, I have to agree to the rules and policies before being accepted. I have seen hundreds of posts and comments taken down for violating guidelines, accounts removed from the group, and personal posts denied by the admins. It is cool that Facebook has that tool to take some of the heat off its employees and give users a sense of control over something they are truly interested in. Gillespie highlights Reddit as a platform that relies heavily on this approach, and while I haven't
used my Reddit account as much, I do see edited comments or little notes that a moderator removed a post or archived a thread. One last thing I really liked from the video was Gillespie's claim that a platform's survival depends on moderation, and that a platform without moderation is not really a platform at all. Moderation isn't something I think about often, so it had never occurred
to me how platforms would manage without flag or report features. Twitter is a great example of this: you can find dozens of threads with inappropriate content that goes untouched for days, and sometimes that content gets shared on innocent, normal threads. In my opinion, it has gotten worse since Elon Musk bought Twitter. Overall, I felt really enlightened after watching this video and have been more mindful while on social media. I now pay attention to the deleted content I once saw and to the reporting features platforms offer.