

Studies in Conflict & Terrorism

ISSN: 1057-610X (Print) 1521-0731 (Online) Journal homepage: https://www.tandfonline.com/loi/uter20

Research Note: Spreading Hate on TikTok

Gabriel Weimann & Natalie Masri

To cite this article: Gabriel Weimann & Natalie Masri (2020): Research Note: Spreading Hate on TikTok, Studies in Conflict & Terrorism, DOI: 10.1080/1057610X.2020.1780027

To link to this article: https://doi.org/10.1080/1057610X.2020.1780027

Published online: 19 Jun 2020.

STUDIES IN CONFLICT & TERRORISM
https://doi.org/10.1080/1057610X.2020.1780027

Research Note: Spreading Hate on TikTok


Gabriel Weimann and Natalie Masri
Institute for Counter Terrorism (ICT), Herzliya, Israel

ABSTRACT

TikTok is the fastest-growing application today, attracting a huge audience of 1.5 billion active users, mostly children and teenagers. Recently, the presence of extremist groups on social media platforms has become more prominent and massive. Yet, while most scholarly attention has focused on leading platforms like Twitter, Facebook or Instagram, the extremist migration to other platforms like TikTok has gone unnoticed. This study is a first attempt to examine the far-right's use of TikTok: it is a descriptive analysis based on a systematic content analysis of TikTok videos posted in early 2020. Our findings reveal the disturbing presence of far-right extremism in the videos, commentary, symbols and pictures included in TikTok's postings. While similar concerns have been raised with regard to other social platforms, TikTok has unique features that make it more troublesome. First, unlike all other social media, TikTok's users are almost all young children, who are more naive and gullible when it comes to malicious content. Second, TikTok is the youngest platform and thus lags severely behind its rivals, which have had more time to grapple with how to protect their users from disturbing and harmful content. Yet TikTok should have learned from these other platforms' experiences and applied its own Terms of Service, which do not allow postings that are deliberately designed to provoke or antagonize people, are intended to harass, harm, hurt, scare, distress, embarrass or upset people, or include threats of physical violence.

ARTICLE HISTORY
Received 18 May 2020
Accepted 25 May 2020

TikTok is one of the most popular applications in the world: hundreds of millions of
users, many of them children and teenagers, use it to upload, watch and browse lip
sync videos and memes. TikTok, developed by ByteDance, a Chinese company, allows
users to upload up to 60-second lip-synched videos with a variety of creative and inter-
active features. It is the fastest-growing app and is ranked the seventh most downloaded
app of the past decade. However, this app has a darker side. TikTok users are sharing
calls for violence against people of color and Jews, as well as creating and sharing neo-
Nazi propaganda. Technically, according to the company’s Terms of Service, TikTok
does not allow people under 13 years of age to use its platform, but many of its users in
videos are clearly younger. The popularity and vulnerability of TikTok were noted by
various extremist groups including numerous far-right activists and groups.1 This study
examines their postings on TikTok.

CONTACT Gabriel Weimann weimann@com.haifa.ac.il Institute for Counter Terrorism (ICT), Herzliya, Israel.
© 2020 Taylor & Francis Group, LLC

The Rise of Far-Right Terrorism


Far-right violence and terrorism are a growing threat to Western societies. Far-right
terrorist attacks increased by 320 per cent between 2014 and 2019 according to the
2019 Global Terrorism Index.2 In 2018 alone, far-right terrorist attacks made up
17.2% of all terrorist incidents in the West, compared to Islamic groups, which made
up 6.28% of all attacks.3 In January 2019, the Anti-Defamation League's Center on
Extremism reported that every extremist killing in the U.S. in 2018 was linked to far-right
individuals or organizations.4 German authorities registered 8,605 right-wing
extremist offenses, including 363 violent crimes, in the first half of 2019, an increase of
some 900 far-right crimes over the same period of 2018. Far-right terrorism is on average
five times deadlier than far-left terrorism, with an average of 0.92 deaths per attack
compared to 0.17 for far-left terrorism.5 Nineteen countries across North America,
Western Europe and Oceania have
been targeted by far-right attackers. This trend in far-right attacks has led some
observers to state that far-right domestic terrorism has not been treated seriously
enough in the West and that security and intelligence services should pay closer atten-
tion to this emerging threat.6
“Far-right” refers to a political ideology that centers on one or more of the following
elements: strident nationalism (usually racial or exclusivist in some fashion), fascism,
racism, anti-Semitism, anti-immigration, chauvinism, nativism, anti-LGBTQ, and xeno-
phobia. There is also some overlap between far-right groups and Incel movements due
to their shared beliefs of hatred and intolerance.7 Incel is short for ‘involuntary celibate’
and refers to online groups of men who feel that they cannot enter sexual relationships
and express hatred toward women accusing them of sexually manipulating or humiliat-
ing men. Other far-right groups enforce traditional gender roles and oppose abortion.
Far-right groups are usually strongly authoritarian, but often with populist elements and
have historically been anti-communist, although this characteristic has become less
prominent since the end of the Cold War. Not all groups or organizations with any one
of these characteristics can be considered far-right and not all far-right groups are auto-
matically violent or terroristic. However, terrorist groups with these characteristics and
individuals sympathetic to these ideals have been classified as “far-right terrorism”.
Far-right terrorists have a strong inclination to change the established order, favor
traditional attributes (typically white, heterosexual and Christian), and advocate the
forced establishment of authoritarian order. Far-right attacks are also less predictable,
as perpetrators are typically unaffiliated with a terrorist group, making them harder to
detect. Far-right extremists have also shown a long-term interest in acquiring Chemical,
Biological, Radiological and Nuclear (CBRN) weapons, resulting in several CBRN far-right
terrorist plots in Western countries (mostly in the U.S.) which fortunately did not
come to fruition.8 Another development is the phenomenon of individuals taking part
in a group of extremists performing terrorist acts without previous contact with the
extremist milieu, sometimes described as "Hive Terrorism": terrorist acts or violent
hate crimes committed by a spontaneously formed crowd that quickly disbands
after the incident.9 All of the above points to a significant terrorist threat posed by
extreme right-wing activists and groups.

The Attraction of Online Platforms


The far-right's online presence has developed over three decades, using bulletin board systems,
websites, online forums, and, more recently, social media.10 Social media has "algorithmically
amplified, sped up and circulated a political backlash by White voters that the alt-right has
exploited …, making extreme viewpoints more tolerable in public discourse".11 As Ganesh
argues, much of far-right groups' ability to manipulate public discourse is due to their
adoption of the practices and esthetics of misogynist, trolling, and gaming subcultures, where
they have honed their ability to use text, memes, and videos to make emotional appeals and
encourage participation in anti-immigrant and white supremacist discourse.12

The growing presence of extremist groups in cyberspace is at the nexus of two key
trends: the democratization of communications driven by user-generated content on the
Internet, and the growing awareness of modern vigilantes of the potential of the Internet
for their aims. Terrorists have used the Internet, as several studies have revealed, for
numerous purposes.13 They use the Net to launch psychological campaigns, recruit and
direct volunteers, raise funds, incite violence and provide training. They also use it to
plan, network, and coordinate attacks. Thus, not only has the number of terrorist online
platforms increased, but the ways in which terrorists use the Internet have also diversified.
The network of computer-mediated communication (CMC) is ideal for extremists-as-
communicators: it is decentralized, cannot be subjected to control or restriction, is not cen-
sored and allows free access to anyone who wants it. The typical, loosely knit network of
cells, divisions, and subgroups of modern extremist organizations finds the Internet both
ideal and vital for inter- and intra-group networking. The great virtues of the Internet—ease
of access, lack of regulation, vast potential audiences, fast flow of information, and so
forth—have been converted into advantages for groups committed to terrorizing societies to
achieve their goals. The anonymity offered by the Internet is very attractive to modern radi-
cals, terrorists and vigilantes. Because of their extremist beliefs and values, these actors
require anonymity to exist and operate in social environments that may not agree with their
particular ideology or activities. The online platforms, from websites to social media and the
Dark Net, provide this anonymity and easy access from everywhere with the option to post
messages, to e-mail, to upload or download information and to disappear into the dark.
These advantages have not gone unnoticed by far-right groups, who have moved their
communications, propaganda, instruction and training to cyberspace. As Hoffman
and Ware concluded, “today’s far-right extremists, like predecessors from previous gen-
erations, are employing cutting-edge technologies for terrorist purposes”.14 The far-right
online presence is not restricted to a single online platform or space but is instead a
patchwork of various types of platforms and spaces, from websites to social media and
even the Dark Net. Far-right extremists are generating their content on a variety of
online platforms and increasingly also utilizing a wider range of new media technologies
for their purposes.15 A range of relatively new and highly accessible communication
“applications” is another component of this trend. One of them is TikTok.

TikTok’s Contents and Audiences


TikTok is a Chinese video-sharing platform owned by ByteDance, a Beijing-based com-
pany. It is used mostly to create short dance, lip-sync, comedy, and talent videos. Users

can upload videos that typically run 15 seconds before looping to restart. They can also
connect clips to create videos up to 60 seconds long. Videos incorporate music samples, filters,
quick cuts, stickers and other creative add-ons that allow users to make the most of the
short length.
The app was launched in 2016 in China and then in 2017 for iOS and Android in
markets outside of China. In just two years, TikTok has emerged to rival companies
like Netflix, YouTube, Snapchat, and Facebook with more than one billion downloads
in 150 markets worldwide and 75 languages. The fresh, innovative and fast-moving con-
tent has hooked young audiences around the world. In the three years after it launched
in 2016, TikTok acquired 1.5 billion active users.16 In the first quarter of 2020 alone
TikTok amassed more than 315 million installs in the App Store and Google Play, more
than any other app ever in a quarter, resulting in over 2 billion downloads globally
since its launch.17 TikTok has proven to attract the younger generation: 41% of its
users are between the ages of 16 and 24,18 and among these users, 90% say they use
the app daily. Recently, TikTok has surpassed the likes of Facebook, Instagram,
Messenger, and Snapchat in App Store and Google Play installs. Thus, TikTok has
become one of the fastest-growing social media platforms in the world.
TikTok’s mission is “to capture and present the world’s creativity, knowledge, and
precious life moments, directly from the mobile phone. TikTok enables everyone to be
a creator, and encourages users to share their passion and creative expression through
their videos.” What helps TikTok stand out among the competition is that practically
anyone can become a content provider because of the simplicity of using the app. That
is why it appeals to so many young users around the world.
The popularity of TikTok amongst the younger generation could also be explained by
the fact that the app creators decided to choose teenagers as their target audience from
the very beginning. By using a short video format unlike other social media sites such
as YouTube, TikTok can target teenagers and young adults with shorter attention spans.
It also encourages creativity with its wide range of effects and challenges and due to the
nature of its algorithm, it is easier to go viral on TikTok compared to other social net-
working websites.19 With teens as the target audience specified from the start, TikTok's
designers matched the habits and preferences of this age group, thus creating a platform
that gives its audience exactly what it likes. It allows the younger crowd to express
themselves creatively, by dancing and singing, while using their favorite music clips.

The Dark Side of TikTok


The popular teenage platform has a dark side too. TikTok has been the subject of trou-
bling reports about its content, which is reportedly filled with nude images of children,
child predators, devious algorithms, lack of privacy, and teens bullying and harassing
one another.20 Moreover, the seemingly innocent video-sharing platform is hiding a
much more sinister side: it allows for a steady stream of drugs, predatory messages and
animal cruelty. Its lax security and control have allowed it to become a magnet for
pedophiles, profanity, crime, violence and extremism.
On 3 July 2018, TikTok was banned in Indonesia, after the Indonesian government
accused it of promulgating “pornography, inappropriate content and blasphemy.”

Shortly afterwards, TikTok pledged to task 20 staff with censoring TikTok content in
Indonesia, and the ban was lifted on 11 July 2018. In February 2019, several Indian pol-
iticians called for TikTok to be banned or more tightly regulated, after concerns
emerged about sexually explicit content, cyberbullying, and deepfakes in which a person
in an existing image or video is replaced with someone else’s likeness. Extremist content
has also been lurking on TikTok. In October 2019, it was reported that ISIS recruitment
content was discovered on TikTok.21 The videos featured ISIS fighters with guns, corp-
ses, as well as videos of beheadings.22 Some of the videos reportedly targeted potential
future fighters, with shots glorifying militants set to ISIS songs. Others seemed to target
young girls, reportedly using the term “jihad lover,” and women calling themselves
“jihadist and proud.” The videos also used TikTok’s in-app features, like filters and
hearts as well as catchy music. On December 24, 2019, an ISIS supporter posted a
TikTok video featuring an audio message by an unknown ISIS fighter vowing that “the
Islamic State will remain” despite the online clampdown. The video concludes with a
recording of Al-Baghdadi’s voice, in which he says: “Steadfastness, and no retreat.”
Recently, TikTok has come to host additional toxic content: that of far-right groups. An
investigative report by the British Sun Online revealed that the app is exposing young
children and teens to a "cesspit of hate".23 The reporters revealed a series of posts
celebrating terrorist killings and promoting Holocaust denial and anti-Jewish conspiracy
theories. Scores of posts feature sickening anti-Semitic taunts, with cartoons depicting
Jewish men with large noses and jokes about the Holocaust receiving hundreds of likes
and comments. In one video, racist sketches of characters labeled 'A Sneaky Jew' and
‘Mega Jew’ are followed by anti-Semitic tropes that Jewish people control the media,
financial sector and government.
Hate speech on TikTok largely flew under the radar until December 2019, when
Motherboard reported that it had found examples of “blatant, violent white supremacy
and Nazism,” including calls for the murders of Jews and black people.24 Some accounts
verbatim read “kill all n,” “all jews must die,” and “killn.” (The words are
uncensored on the app). The hate speech material on TikTok is varied. Some accounts
signaled support for Atomwaffen, a violent Neo-Nazi group linked to the murders of
several Jewish people across the United States. One account Motherboard found was
called “Race War Now.” The user profile photo of another account was of an offensive
caricature of a Jewish person, depicting a greedy rat. One video contained a succession
of users making Nazi salutes. Another video included the message, “I have a solution; a
final solution,” referring to the Holocaust. These reports, based on journalists’ explor-
ation of TikTok’s contents, call for systematic empirical research. The present study sets
out to study and analyze the presence of far-right extremism on TikTok.

Method
To scan TikTok for far-right content we applied a systematic content analysis. The first
stage involved identifying TikTok accounts of known far-right groups. The list of far-
right groups includes neo-fascists, neo-Nazis, anti-Semites, white supremacists and other
extremist groups and organizations that feature ultranationalist, chauvinist, xenophobic,
theocratic, racist, homophobic, anti-communist, or reactionary views.25 Then we

Table 1. Far-right postings on TikTok according to theme.

Topic                                    No. of posts
Far-right attackers                      35
Anti-Semitism                            43
Promoting violence                       17
Hitler's speeches                        14
Sieg Heil                                11
Featuring far-right flags & swastikas     8
Racist                                   13
Homophobic                                6
KKK-related                               7
White pride/chauvinism                    8
Oswald Mosley speeches                   16
Residuals                                12

Note: Table includes multiple classifications of posts when they have multiple attributes.

collected posts on TikTok with hashtags associated with the far-right movements, such
as “red-pilled” or “based” and those mentioning known far-right groups. We then
examined these accounts as well as the accounts which showed interest in the topic
through liking, commenting or following these accounts. Far-right users were identified
by examining their avatars and profile photos as well as their usernames: many right-
wing users use flags of known far-right groups or images of far-right figures or symbols
as their avatars or profile photos, as well as usernames relating to the ideology. The scan
of TikTok videos was conducted over four months (February–May 2020).
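The screening step described above — matching collected posts against a seed list of far-right hashtags and against username cues — can be sketched roughly as follows. This is a minimal illustration only: the hashtag list, cue strings, helper name `flag_post` and post records are all hypothetical, the study's actual coding was done manually, and TikTok exposes no such public API, so the sketch assumes posts have already been exported to simple records.

```python
# Hypothetical sketch of hashtag/username screening; not the study's actual tooling.
# Seed lists below are invented examples based on terms mentioned in the text.
FAR_RIGHT_HASHTAGS = {"redpilled", "based", "fashwave", "boogaloo"}
FAR_RIGHT_NAME_CUES = ("88", "moon man", "wizard man")

def flag_post(post: dict) -> bool:
    """Return True if a post matches a seed hashtag or a username cue."""
    tags = {t.lower().lstrip("#") for t in post.get("hashtags", [])}
    if tags & FAR_RIGHT_HASHTAGS:          # any seed hashtag present
        return True
    name = post.get("username", "").lower()
    return any(cue in name for cue in FAR_RIGHT_NAME_CUES)

# Invented post records standing in for exported TikTok data.
posts = [
    {"username": "dancer01", "hashtags": ["#fyp", "#dance"]},
    {"username": "user1488", "hashtags": []},
    {"username": "ordinary_user", "hashtags": ["#based", "#redpilled"]},
]
flagged = [p for p in posts if flag_post(p)]  # candidates for manual review
```

In the study's workflow, posts flagged this way would only be candidates: the avatars, profile photos and content of each account were then examined manually, since a string like "88" can appear in entirely innocuous usernames.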

Findings
Our systematic scan revealed 196 postings in total related to far-right extremism.
These encompassed the far-right ideologies of fascism, racism, anti-Semitism, anti-immigration,
chauvinism, nativism, and xenophobia. The postings ranged from
espousing violence to promoting conspiracy theories and glorifying terrorists (Table 1).
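Because a single post can carry several attributes (see the note to Table 1), the per-theme counts are multi-label tallies rather than a partition of the 196 posts. A minimal sketch of that bookkeeping, with invented posts and labels rather than the study's actual coding data:

```python
# Multi-label tallying: one post may be counted under several themes, so the
# column total of a table like Table 1 can exceed the number of distinct posts.
# The post records and theme labels below are invented for illustration.
from collections import Counter

posts = [
    {"id": 1, "themes": ["anti-Semitism", "promoting violence"]},
    {"id": 2, "themes": ["Hitler's speeches"]},
    {"id": 3, "themes": ["anti-Semitism"]},
]

theme_counts = Counter(theme for post in posts for theme in post["themes"])

distinct_posts = len(posts)                 # 3 distinct posts
total_classifications = sum(theme_counts.values())  # 4 theme assignments
```

This is why the theme counts in Table 1 sum to more than the 196 postings found.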
The majority of far-right postings on TikTok related to anti-Semitism and Holocaust
denial: we found 43 postings of this kind. We also found 14 postings of Hitler's
speeches, including scenes of life in the Third Reich, with captions such as
"#hitlerdidnothing #NAZISMIS #HAIL", and 16 postings of Oswald Mosley's
speeches with comments such as "based and redpilled". We also found 6 homophobic
and 13 racist postings, including a young boy preaching to his followers that "white
people need to start owning some black people, so take your black friends and sell
them", and 11 postings of the Sieg Heil salute. Another trend was 17 videos
encouraging viewers to fight and take up arms, featuring far-right symbols such as
the swastika and the “sonnenrad”. The sonnenrad or black sun is an ancient European
symbol, which has been used by the SA and SS in Nazi Germany and has since been
hijacked by neo-Nazis and white supremacists. These types of postings tend to be color-
ful and some use popular music such as “The Hanging Tree” song from The Hunger
Games movie, thus appealing to a young audience. One video, from Atomwaffen
Division, a neo-Nazi terrorist network calls for a “race war now”. Another post in a
similar style states “the world is ending, all it needs is a little push … who will be the
one to push?”. This video is captioned “The #collapse is coming. Steel your #hearts.
Come talk to me about how you can keep yourself and your #people safe” which

suggests that the far-right is trying to actively recruit members on TikTok with the cre-
ator stating “I have a telegram” and on a similar posting one user comments “you
know any chat rooms or telegram rooms with likeminded”.
We found several “combined” hate messages on TikTok, targeting multiple groups.
In one post, a man rants “no gays, no women, no jews. It’s so fucking simple like bing
bing boom”. Two other postings include individuals wearing the “moon man” face
mask, associated with the KKK movement, doing the Nazi salute.
TikTok users identify themselves in two ways, through a user handle (@johnsmith)
and a display name that appears on their profile such as “John Smith”. We also looked
at the comments and the number of “likes” on videos, as this is the main way for users
to express their sympathy. We found 29 accounts with user handles and display names
relating to "Adolf Hitler". Another trend on TikTok was the number of accounts named
after far-right attackers: 2 named after "Anders Breivik", 8 after "Brenton Tarrant" or
"Saint Tarrant", as some refer to him, 2 after "Dylann Roof", 14 after "Elliot Rodger",
1 after "Robert Bowers" and 8 after "Dylann and Eric Harris". Many of these accounts
posted videos such as "free my homie [Brenton Tarrant] he did nothing wrong much
love and support to you" or glorified the Columbine attack of 1999: "Happy Columbine
anniversary!". We also found 13 accounts featuring words often associated with the
Ku Klux Klan, such as "Moon Man" and "Wizard Man", and 26 accounts featuring the
number "88" in their usernames – the white supremacist numerical code
for "Heil Hitler".
TikTok users also upload a profile picture, usually to represent their online persona.
We found multiple profile pictures featuring far-right symbols, including 4 accounts
using SS bolts, 3 accounts with the Stormfront logo, 3 pictures depicting the Totenkopf
used by Hitler’s SS, 12 accounts featuring swastikas, 5 accounts depicting the Nazi eagle
and 6 accounts depicting the Nazi flag. We also found 13 accounts depicting the
sonnenrad, which was prominently used in Nazi Germany and is today popular amongst
groups such as the Atomwaffen Division.
One trend on TikTok is postings from accounts that are run by children and teenagers,
or presented as such. In one video a young boy rants that "black people need to
become property again … so like take your black friends and sell them”. Another young
boy posts a news article that India has several Nazi sympathizers and comments “me
and the boys going to India”. Another trend is the number of flags and other far-right
memorabilia identifiable in the background of TikTok postings. In one video, a hat
with the numbers "14/88", which refer to "Heil Hitler" and white supremacist ideas,
can be identified, as well as the Confederate flag, the Gadsden flag and the flag of the
British Union of Fascists, despite the latter having been a proscribed group in the
U.K. since 1940. The Gadsden flag
was designed during the American Revolution in 1775 with the rattlesnake representing
America’s ideals and is commonly perceived as a patriotic flag. However, it has also
been adopted by far-right political groups such as the Tea Party and Second
Amendment supporters.
In another posting the Nazi flag can be seen hanging on a wall alongside an Iron
Cross flag; a common hate symbol used by Neo-Nazis and white supremacists, receiving
comments such as “based flag”. Another posting shows men standing behind the

Atomwaffen Division flag, a Neo-Nazi group, burning flags such as the LGTB rainbow
flag, and a Black Lives Matter flag.
Far-right terrorists and killers are also glorified on TikTok. In one posting, Brenton
Tarrant, responsible for the 2019 Christchurch shootings in New Zealand that killed 51
people, is trivialized as the intro to a video game entitled "Brenton Tarrant's Extreme
Mosque Shooter", with the option to "start" and "load game" alongside a picture of
Tarrant. Another video referring to Tarrant, entitled "free my homie, he did nothing
wrong much love and support to you", received comments such as "he's crazy", to which
the creator responded, insisting "no he is not". Three accounts' profile pictures showed
their support for Anders Breivik, a far-right terrorist responsible for the 2011 Norway
attacks that killed 77 people. Breivik is represented as a hero, with a halo, the GoPro
camera he used to live-stream his attack on Facebook, a copy of his manifesto, a
sonnenrad (see above), which featured on the manifesto's cover page, the rifle used
to commit the attack, a military helmet and the far-right "sun cross" version of the
Celtic Cross. One account is dedicated entirely to posting videos of Dylann Roof, an
American white supremacist responsible for the Charleston church shooting of 2015
that killed nine people; its 16 videos gained over 240 likes, with captions such as
"I love him" and fans urging the account to post more videos of Roof. Elliot Rodger,
also referred to as "the Supreme Gentleman", was responsible for the 2014 Isla Vista
killings, an act of retribution against the women who had rejected him, and has been
described as an "incel hero". We found four postings of Rodger's "Retribution" video,
edited with effects and music, and ten other videos posting photo montages and memes
of Rodger, as well as comments referring to him such as "I hate that he [Elliot Rodger]
thinks he is the only one dealing with this. Like what he did was not necessary". Other
postings on
TikTok suggest that there is an incel presence on the platform. One post claims, "women are
evil all they do is break your heart” and is accompanied by a song entitled “women are
the root of all evil”. Another post entitled “Intro to the Woman Question” discusses
whether women test men and uses the example of Adam and Eve. This post received
comments such as “women are always the problem” and “women are easily corrupted
and manipulated”, “so since the beginning of time women been ruling the world?” and
“no women, no problem”.
Another trend is the number of postings encouraging followers to use violence.
One video encourages its viewers to "fight back … they've taken your future … revolt
against the modern world"; in the same video the sonnenrad and SS bolts can be seen.
Another video, also encouraging its viewers to fight, uses a font similar to that of
Atomwaffen Division videos and, behind the image of a sonnenrad, says "no country for
Islam". This video uses bright colors and the song from the popular film The Hunger
Games, thus trying to appeal to and seduce children and teenagers.
Some of the videos posted by users are followed by viewers’ comments. One com-
ment on a TikTok far-right video reads “finally a like-minded tik tok community”,
reflecting the desire to establish a virtual community on this platform. One post reads
“you can marry my daughter off to a black man, turn my son gay, which will ensure
the extinction of my race, but you better not take my guns”. This post received numer-
ous comments including “14” – short for the white supremacist slogan known as 14

Words which states that “we must secure the existence of our people and a future for
white children”. Another comment reads “88”, the numerical code for “Heil Hitler”,
other comments include “no racemixing”, “HH” and “no, my daughter will not marry a
black man". Another video, captioned "boogwave #boogaloo" in reference to a future civil
war, depicts someone's gun collection. This post received comments such as "I'm in"
and "now this is aesthetic". Another posting, showing how easy it is to buy the
Gadsden flag on Amazon and buy guns online in the U.S., received 2,273 likes and 14.8
thousand views, as well as comments advising on how one could obtain a
machine gun online. One comment reads "I own a registered machine gun, all it took
was the cost of it, a $200 tax stamp, and to get it transferred to my name", to which one
user commented, "you know way too much about this … are you going to do something?".
Another comment reads “is there a nazbol house?” – referring to the “TikTok house”
established in the U.S. and U.K., where some of the country’s most popular TikTok
content creators live. In another video, entitled "Erika", a cartoon of a bird is
accompanied by the music and lyrics of the song "Erika" by Herms Niel, who joined the
Nazi Party in Germany in 1933. This seemingly innocent video received 34.6 thousand
likes and comments such as "based", "redpilled", "best video on TikTok" and "they
[Germans] knew how to bake", as well as four users commenting "Heil", "Sieg Hail" or
"Hail Victory" and four users commenting with the swastika symbol.
Far-right extremists have also taken advantage of TikTok’s features such as filters,
backgrounds, and effects. In one posting, a user opens an oven, with the text “me
kosher baking” and applies the “big nose filter” to himself to represent “everyone next
in line at Auschwitz”. This post received 181 likes and 2441 views, with comments such
as “they get what they deserve” and “alright it might have happened … but 6 million?”.
Another user uses the “duet” feature on TikTok to compare his far-right views to those
of a liberal by using the popular challenge “put a finger down” in which a user puts a
finger down for every time they agree with something the narrator says. In one post the
user is coaxed into agreeing that “immigrants deserve human rights”, that “black lives
matter” or that “women know what’s best for their bodies”. Another viral challenge on
the app is where users show images of boys they like to the song “I was busy thinking
bout Boys”. One far-right user uses this challenge to show images of Hitler and Oswald
Mosley, captioned “Lords, Saviours, Gods … BOYS”.
TikTok users can add hashtags to their post captions, which enables posts to trend on
the app. There appear to be no limitations on which hashtags users can attach to their posts,
what they can write in their bios, or how they can refer to themselves in their usernames. TikTok enables users to search
for hashtags, usernames, and songs, yet a search for "Adolf Hitler" or "KKK" will result
in "no results found – this phrase may be associated with hateful behaviour". However, users
are still allowed to use these hashtags; they just cannot be searched for. Far-right users
also use hashtags that can be searched for, such as #heil or #hail; however,
66% of the postings we found did not include any hashtags in their captions, in order to avoid
detection. Another way users avoid detection is by taking advantage of the option to
add a "profile video" rather than a profile picture: users time their video so that it
shows far-right flags or other content in account-name search results, which is then
swiftly replaced by another image when one clicks on their profile. For example, one
account’s profile video appears to be a swastika, but when one clicks on the photo, a
completely different and innocuous image is shown. Users also change their usernames
often to avoid detection. Another trend is the use of posts disguised as either Korean or
Japanese music videos with text promoting hatred rather than the original lyrics. One
video disguised as a music video of the South Korean pop band “Twice” claims that
"despite making up 12.5% of the population, black people commit over 56% of all violent
crime". Another "music video" claims that "the Holocaust didn't happen", while
another promotes the dangers of feminism.
TikTok users can search for specific hashtags, which will show the number of views
associated with the hashtag, as well as similar hashtags. Users are encouraged to use
multiple hashtags to increase the number of likes and views on a post, however tenuous
the connection to its content. For example, the hashtag "fashwave", associated with the alt-right,
has 7333 views across 28 videos. The hashtag “boogaloo” used to reference a future civil
war has 12.8 thousand views across 21 videos. The hashtag “dylannroof” has 235,700
views across 27 videos. The hashtag “whitegenocide”, a conspiracy theory among white
supremacists has 61.5 thousand views across 15 videos. The hashtag “groyper” has 113.6
thousand views across 34 videos. The hashtag "hailhitler" has 3344 views across 18 videos;
the hashtag "heilhitler" has 305 views across 4 videos.
TikTok’s “For You” page uses an algorithm to recommend videos to users. According
to TikTok’s listing in the iOS App Store, it is a “personalised video feed based on what
you watch, like, and share."26 The "For You" page shows that users do not need a mass
following to go viral, as a video will be shown to any user who has shown interest in
similar videos on the app. The page is the first thing that opens when users launch the
app; the videos play automatically, making it hard not to start watching them, and
users can swipe through an unlimited stream of videos selected by the app's algorithm. This is where
TikTok differs from the newsfeeds of other social networking apps, as it is not based
on whom one follows. In our study, we found that after reviewing videos of far-right
material, the “For You” page started recommending similar videos based on the content
we had been watching, despite our not interacting with or following any of these users. TikTok's
algorithms could push young users who accidentally or otherwise start watching such
content to be exposed to more of the same disturbing videos.
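To make the dynamic concrete, the kind of interest-based ranking described above can be sketched in a few lines of Python. This is a deliberately minimal illustration over invented data: the video catalog, tags, scoring rule, and all names here are hypothetical, and TikTok's actual recommendation system is proprietary and far more complex.

```python
from collections import Counter

# Hypothetical catalog: each video carries a set of content tags.
videos = {
    "v1": {"tags": {"dance", "music"}},
    "v2": {"tags": {"music", "comedy"}},
    "v3": {"tags": {"politics", "extremism"}},
    "v4": {"tags": {"extremism", "memes"}},
}

def recommend(watch_history, videos, k=2):
    """Rank unwatched videos by tag overlap with what the user watched."""
    interest = Counter(tag for vid in watch_history
                       for tag in videos[vid]["tags"])
    candidates = [v for v in videos if v not in watch_history]
    return sorted(candidates,
                  key=lambda v: -sum(interest[t] for t in videos[v]["tags"]))[:k]

# A user who watched a single extremism-tagged video, without following
# or interacting with anyone, is steered toward more of the same.
print(recommend(["v3"], videos))  # ['v4', 'v1'] -- the extremism-tagged clip ranks first
```

Even this toy ranker reproduces the feedback loop observed in our study: one stray view reweights the feed toward similar content, with no follow or like required.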

Conclusion
TikTok has spread like digital wildfire, snapping up over 2 billion downloads since its
global launch three years ago, including millions of young children, teenagers and
adolescents.27 The world's fastest-growing social media platform presents short clips of
lip-syncing to songs or other sounds, but there is a far more sinister side: TikTok has
become a magnet for various malevolent predators, including pedophiles, terrorists and
extremists, who exploit the platform's young audience and lax security to prey on the
vulnerable. Our scan of TikTok revealed a disturbing presence of far-right extremism.
These findings come amid growing calls for tighter social media regulation. TikTok,
the Chinese-owned app, claims it is "raw, real, and without boundaries". Yet that very
lack of boundaries, combined with the platform's growing popularity, makes it an
ideal hotbed for violent and extremist content.
While similar concerns are raised with regard to other social platforms, TikTok has
unique features that make it more troublesome. First, unlike on any other social medium,
TikTok's users are almost all young children, who are more naïve and gullible when it
comes to malicious content. Second, TikTok is the youngest platform and thus lags far
behind its rivals, which have had more time to grapple with how to protect their
users from disturbing and harmful content. Nevertheless, TikTok's Terms of Service
state that users may not

• Intimidate or harass another, or promote sexually explicit material, violence or
discrimination based on race, sex, religion, nationality, disability, sexual orientation
or age; or post
• any material which is defamatory of any person, obscene, offensive, pornographic,
hateful or inflammatory;
• any material that would constitute, encourage or provide instructions for a criminal
offense, dangerous activities or self-harm;
• any material that is deliberately designed to provoke or antagonize people, especially
trolling and bullying, or is intended to harass, harm, hurt, scare, distress,
embarrass or upset people;
• any material that contains a threat of any kind, including threats of physical
violence;
• any material that is racist or discriminatory, including discrimination on the
basis of someone's race, religion, age, gender, disability or sexuality.28

Our findings, as well as those of other studies, reveal TikTok's inability to enforce these
self-imposed guidelines. Moreover, the special features of TikTok, as a video platform where users
watch content from strangers rather than only from a network of chosen friends, lend themselves
to new ways of spreading abuse and evading detection. There are fewer keywords to
monitor, making it more difficult for computers to automatically flag posts. Our scan
required very complex methods to reveal extremist postings, owing to the users' sophistication
in combining names, words, symbols and codes in ways that make it impossible
for automatic crawlers to find them. There has also been evidence that some bad
actors, including white supremacists, have been piggybacking on trending topics to
spread their posts. TikTok's recommendation algorithm is another disconcerting factor:
it can push users who unintentionally see disturbing content to see more, so once users
have been exposed to one extremist video, the likelihood that they are shown more of
that content is extremely high.
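The keyword-evasion problem described above can be illustrated with a toy filter. The blocklist, the leetspeak table, and both functions below are invented for illustration and bear no relation to TikTok's actual moderation pipeline; the point is only that exact substring matching fails on trivially obfuscated text, while even a simple normalization pass catches some of it.

```python
import unicodedata

# Invented examples -- not TikTok's actual blocklist.
BLOCKLIST = {"heilhitler", "whitegenocide"}

# Map a few common leetspeak substitutions back to letters.
LEET = str.maketrans({"1": "i", "3": "e", "4": "a", "0": "o", "$": "s"})

def naive_flag(text):
    """Flag only exact blocklist substrings -- trivially evaded."""
    t = text.lower().replace(" ", "")
    return any(term in t for term in BLOCKLIST)

def normalized_flag(text):
    """Strip accents, undo leetspeak, drop separators, then match."""
    t = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode()
    t = t.lower().translate(LEET).replace(" ", "").replace(".", "")
    return any(term in t for term in BLOCKLIST)

post = "h3il h1tl3r"           # obfuscated variant of a blocked phrase
print(naive_flag(post))         # False -- slips past exact matching
print(normalized_flag(post))    # True
```

Real postings use far harder variants (symbols, coded references, text embedded in video frames), which is why the material in our scan required manual, context-aware review rather than automated crawling.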
Finally, TikTok has reportedly had major security vulnerabilities. According to
research published in 2020 by the Israeli cybersecurity company Check Point, flaws in
the app allowed hackers to send users messages containing links that could be leveraged
to hijack user accounts and gain access to private videos. Check Point
researchers also discovered they were able to access users’ personal information through
the company’s website, identifying yet another blatant vulnerability. Given how young
many of TikTok’s users are, these security issues are that much more troubling. In 2019
TikTok paid a $5.7 million fine to the U.S. government. The fine, paid to settle violations
identified by the FTC, involved the illegal collection of "names, email addresses,
pictures and locations of kids under age 13”. The Children’s Online Privacy Protection
Act, or COPPA, forbids social media companies from collecting the data of children
without explicit parental consent. Failing to obtain that consent would be in violation of
the law and open the company up to potential lawsuits from regulators like the Federal
Trade Commission. Since TikTok, according to our scan, allows far-right extremists
to use its platform, or at least fails to prevent them from doing so, future legal steps may force the
company to apply better monitoring and real regulation.

Notes
1. Jesselyn Cook, “Far-Right Activists Are Taking Their Message To Gen Z On TikTok,”
Huffpost, 16 April 2019, https://www.huffpost.com/entry/far-right-tiktok-gen-z_n_5cb63040e4b082aab08da0d3
(accessed 24 May 2020); Sara Manavis, "How the alt-right is
pivoting to TikTok," NewStatesman, 27 April 2020, https://www.newstatesman.com/science-tech/social-media/2020/04/how-alt-right-pivoting-tiktok-tommy-robinson-britain-first
(accessed 24 May 2020); and Kate Plummer, "TikTok investigates far-right infiltration as hate
videos rack up nearly a million views," Scram, 17 April 2020, https://scramnews.com/tiktok-
investigates-britain-first-tommy-robinson-far-right-hate-videos/ (accessed 24 May 2020).
2. Institute for Economics & Peace, “Global Terrorism Index 2019: Measuring the Impact of
Terrorism,” (Sydney, 2019), http://visionofhumanity.org/app/uploads/2019/11/GTI-2019web.
pdf (accessed 15 May 2020).
3. Institute for Economics & Peace, “Global Terrorism Index 2018: Measuring the Impact
of Terrorism” (Sydney, 2018), http://visionofhumanity.org/app/uploads/2018/12/Global-
Terrorism-Index-2018-1.pdf (accessed 15 May 2020).
4. “Murder and Extremism in the United States in 2018,” Anti-Defamation League, last
modified 2019, https://www.adl.org/murder-and-extremism-2018 (accessed 15 May 2020).
5. Institute for Economics & Peace, “Global Terrorism Index 2019: Measuring the Impact
of Terrorism.”
6. Institute for Economics and Peace (IEP), “Far-Right Attacks in the West Surge by 320 Per
Cent,” Vision of Humanity, http://visionofhumanity.org/global-terrorism-index/far-right-
attacks-in-the-west-surge-by-320-per-cent/ (accessed 24 May 2020).
7. Bruce Hoffman and Jacob Ware, “Incels: America’s Newest Domestic Terrorism Threat,”
Lawfare, 12 January 2020, https://www.lawfareblog.com/incels-americas-newest-domestic-
terrorism-threat, (accessed 24 May 2020).
8. Daniel Koehler and Peter Popella, “Mapping Far-right Chemical, Biological, Radiological,
and Nuclear (CBRN) Terrorism Efforts in the West: Characteristics of Plots and
Perpetrators for Future Assessment," Terrorism and Political Violence (2018), doi:
10.1080/09546553.2018.1500365.
9. Daniel Koehler, “Right-Wing Extremism and Terrorism in Europe”, Prism 6, no. 2
(2016): 84–105.
10. Les Black, “Aryans Reading Adorno: Cyber-Culture and Twenty-First Century
Racism," Ethnic and Racial Studies 25, no. 4 (2002): 628–51, doi: 10.1080/01419870220136664;
Val Burris, Emery Smith and Ann Strahm, "White Supremacist
Networks on the Internet," Sociological Focus 33, no. 2 (2000): 215–35, https://www.jstor.
org/stable/20832076; Susan Zickmund, "Approaching the Radical Other: The Discursive
org/stable/20832076; Susan Zickmund, “Approaching the Radical Other: The Discursive
Culture of Cyberhate,” in Virtual Culture: Identity and Communication in Cybersociety, ed.
Steven Jones (London: SAGE Publications, 2002), 185–206.
11. Jessie Daniels, “The Algorithmic Rise of the ‘Alt-Right,’” Contexts 17, no. 1 (2018): 60–5.
doi: 10.1177/1536504218766547.
12. Bharath Ganesh, “Weaponizing White Thymos: Flows of Rage in the Online Audiences of
the Alt-Right," Cultural Studies (2020): 1–33, doi: 10.1080/09502386.2020.1714687.
13. Gabriel Weimann, Terror on the Internet: The New Arena, The New Challenges
(Washington, DC: United States Institute of Peace Press, 2006); Gabriel Weimann, Terrorism in
Cyberspace: The Next Generation (New York: Columbia University Press, 2015); Gabriel
Weimann, “The Evolution of Terrorist Propaganda in Cyberspace,” in The SAGE Handbook
of Propaganda, ed. Paul Baines, Nicholas O’Shaughnessy and Nancy Snow (London: Sage
Publications Ltd, 2020), 579–94.
14. Bruce Hoffman and Jacob Ware, “Are We Entering A New Era of Far-Right Terrorism?”
War on the Rocks, last modified 27 November 2019, https://warontherocks.com/2019/11/
are-we-entering-a-new-era-of-far-right-terrorism/ (accessed 15 May 2020).
15. Gabriel Weimann and Natalie Masri, “The Virus of Hate: Far-Right Terrorism in
Cyberspace,” International Institute for Counter-Terrorism, last modified 5 April 2020,
https://www.ict.org.il/images/Dark%20Hate.pdf (accessed 15 May 2020).
16. “20 TikTok Statistics Marketers Need to Know: TikTok Demographics & Key Data,”
Mediakix, last modified, 13 December 2019, https://mediakix.com/blog/top-tik-tok-statistics-
demographics/ (accessed 15 May 2020).
17. Craig Chapple, “TikTok Crosses 2 Billion Downloads After Best Quarter For Any App
Ever,” Sensor Tower, 15 April 2020, https://sensortower.com/blog/q1-2020-data-digest
(accessed 23 May 2020).
18. Chris Beer, “Is TikTok Setting the Scene for Music on Social Media?,” Global Web Index, 3
January 2019, https://blog.globalwebindex.com/trends/tiktok-music-social-media/ (accessed
23 May 2020).
19. Kevin Leander and Sarah K. Burriss, "Critical Literacy for a Posthuman World: When
People Read, and Become, with Machines,” British Journal of Educational Technology, 13
March 2020, https://onlinelibrary.wiley.com/doi/abs/10.1111/bjet.12924 (accessed 24
May 2020).
20. Joseph Cox, “TikTok, the App Super Popular With Kids, Has a Nudes Problem,” Vice, 6
December 2018, https://www.vice.com/en_us/article/j5zbmx/tiktok-the-app-super-popular-
with-kids-has-a-nudes-problem (accessed 15 May 2020); Owen Phillips, “The App That
Exposes Teens to Catcalls and Harassment,” OneZero, 26 September 2018, https://onezero.
medium.com/the-app-that-exposes-teens-to-catcalls-and-harassment-tiktok-musically-d98be52c6ff1
(accessed 15 May 2020).
21. William Feuer, “TikTok Removes Two Dozen Accounts Used for ISIS Propaganda,” CNBC,
21 October 2019, https://www.cnbc.com/2019/10/21/tiktok-removes-two-dozen-accounts-
used-for-isis-propaganda.html (accessed 15 May 2020).
22. Georgia Wells, “Islamic State’s TikTok Posts Include Beheading Videos,” The Wall Street
Journal, 23 October 2019, https://www.wsj.com/articles/islamic-states-tiktok-posts-include-
beheading-videos-11571855833 (accessed 15 May 2020).
23. Richard Wheatstone and Ciaran O’Connor, “TikTok Swamped with Sickening Videos of
Terror Attacks Murders, Holocaust Denials and Vile Racist Slurs,” The Sun Online, 1 March
2020, https://www.thesun.co.uk/news/10962862/tiktok-extremist-racist-videos-anti-semitism/
(accessed 15 May 2020).
24. Joseph Cox, “TikTok Has a Nazi Problem,” Motherboard, 18 December 2018, https://www.
vice.com/en_us/article/yw74gy/tiktok-neo-nazis-white-supremacy (accessed 15 May 2020).
25. The list of extremist groups was based on lists collected by the SPLC (Southern Poverty
Law Center), published at https://www.splcenter.org/fighting-hate/extremist-files/groups;
the extremist groups included in "Europe's Right Wing: A Nation-by-Nation Guide to
Political Parties and Extremist Groups," published at http://content.time.com/time/specials/
packages/completelist/0,29569,2085728,00.html; the START Profiles of Individual
Radicalization in the United States (PIRUS) database at https://www.start.umd.edu/profiles-
individual-radicalization-united-states-pirus-keshif; and the list of the CEP (Counter
Extremism Project), published at https://www.counterextremism.com/sites/default/files/
supremacy_landing_files/U.S.%20White%20Supremacy%20Groups_022620.pdf
26. “TikTok – Make Your Day, App Store,” https://apps.apple.com/us/app/id835599320
(accessed 15 May 2020).
27. Craig Chapple, “TikTok Crosses 2 Billion Downloads After Best Quarter For Any
App Ever.”
28. TikTok Terms of Service, last modified February 2019, https://www.tiktok.com/legal/terms-
of-use?lang=en (accessed 15 May 2020).

Disclosure statement
No potential conflict of interest was reported by the author(s).
