


FALLING BEHIND:
HOW SOCIAL MEDIA COMPANIES ARE FAILING
TO COMBAT INAUTHENTIC BEHAVIOUR ONLINE

PREPARED BY THE
NATO STRATEGIC COMMUNICATIONS
CENTRE OF EXCELLENCE

ISBN: 978-9934-564-58-1

Authors: Sebastian Bay, Rolf Fredheim


Research: Singularex
Project manager: Sebastian Bay
Contributors to the Project: Linda Curika, Rueban Manokara
Copy-editing: Anna Reynolds
Design: Kārlis Ulmanis

This report was completed in November 2019 and is based on an experiment that was conducted from May to August 2019.

Singularex is a Social Media Intelligence and Analytics company based in Kharkiv, Ukraine.
Website: www.singularex.com
Email: hello@singularex.com

NATO STRATCOM COE


11b Kalciema Iela
Riga LV1048, Latvia
www.stratcomcoe.org
Facebook/stratcomcoe
Twitter @stratcomcoe

This publication does not represent the opinions or policies of NATO or NATO StratCom COE.
© All rights reserved by the NATO StratCom COE. Reports may not be copied, reproduced,
distributed or publicly displayed without reference to the NATO StratCom COE. The views
expressed here do not represent the views of NATO.

EXECUTIVE SUMMARY

From the 2014 invasion of Ukraine to more recent attempts to interfere in democratic elections, antagonists seeking to influence their adversaries have turned to social media manipulation.

At the heart of this practice is a flourishing market dominated by Manipulation Service Providers (MSPs) based in Russia. Buyers range from individuals to companies to state-level actors. Typically, these service providers sell social media engagement in the form of comments, clicks, likes, and shares.

Since its foundation, the NATO Strategic Communication Centre of Excellence in Riga has studied social media manipulation as an important and integral part of the influence campaigns malicious state and non-state actors direct against the Alliance and its partners.

To test the ability of social media companies to identify and remove manipulation, we bought engagement on 105 different posts on Facebook, Instagram, Twitter, and YouTube using 11 Russian and 5 European (1 Polish, 2 German, 1 French, 1 Italian) social media manipulation service providers.

At a cost of just 300 EUR, we bought 3 530 comments, 25 750 likes, 20 000 views, and 5 100 followers. By studying the accounts that delivered the purchased manipulation, we were able to identify 18 739 accounts used to manipulate social media platforms.

In a test of the platforms' ability to independently detect misuse, we found that four weeks after purchase, 4 in 5 of the bought inauthentic engagements were still online. We further tested the platforms' ability to respond to user feedback by reporting a sample of the fake accounts. Three weeks after reporting, more than 95% of the reported accounts were still active online.

Most of the inauthentic accounts we monitored remained active throughout the experiment. This means that malicious activity conducted by other actors using the same services and the same accounts also went unnoticed.

While we did identify political manipulation—as many as four out of five accounts used for manipulation on Facebook had been used to engage with political content to some extent—we assess that more than 90% of purchased engagements on social media are used for commercial purposes.

We identified fake engagement purchased for 721 political pages and 52 official government pages, including the official accounts of two presidents, the official page of a European political party, and a number of junior and local politicians in Europe and the United States. The vast majority of the political manipulation, however, was aimed at non-western pages.

We further assessed the performance of the four social media companies according to seven criteria designed to measure their ability to counter the malicious use of their services. Overall, our results show that the social media companies are experiencing significant challenges in countering the malicious use of their platforms. While they are better at blocking inauthentic account creation and removing inauthentic followers, they are not doing nearly as well at combating inauthentic comments and views.

Based on this experiment and several other studies we have conducted over the last two years, we assess that Facebook, Instagram, Twitter, and YouTube are still failing to adequately counter inauthentic behaviour on their platforms.

Self-regulation is not working. The manipulation industry is growing year by year. We see no sign that it is becoming substantially more expensive or more difficult to conduct widespread social media manipulation.

In contrast with the reports presented by the social media companies themselves, our report presents a different perspective: We were easily able to buy more than 54 000 inauthentic social media interactions with little or no resistance.

Although the fight against online disinformation and coordinated inauthentic behaviour is far from over, an important finding of our experiment is that the different platforms aren't equally bad—in fact, some are significantly better at identifying and removing manipulative accounts and activities than others. Investment, resources, and determination make a difference.

Based on our experiment, we recommend:

1. Setting new standards and requiring reporting based on more meaningful criteria
2. Establishing independent and well-resourced oversight of the social media platforms
3. Increasing the transparency of the social media platforms
4. Regulating the market for social media manipulation

Social media manipulation is the new frontier
for antagonists seeking to influence elections,
polarise public opinion, and side-track legitimate
political discussions.

INTRODUCTION

Social media manipulation is the new frontier for antagonists seeking to influence elections, polarise public opinion, and side-track legitimate political discussions.

A new industry has developed to feed the market for inauthentic comments, clicks, likes, and followers. The first Manipulation Service Providers (MSPs) to cater to this new need appeared in Russia, but opportunistic MSPs soon began appearing in Europe, often simply reselling Russian-based services.

Buyers range from individuals seeking to boost their popularity to influencers gaming the online advertising system to state-level actors with political motivations. Social media manipulation relies on inauthentic accounts that engage with other accounts online to influence public perception of trends and popularity. Some inauthentic accounts are simple, bot-controlled [short for robot] accounts without pictures or content that view videos or retweet content following a computer program. Others are elaborate or 'aged' accounts with long histories meant to be indistinguishable from genuine users.

Bots are a very cost-efficient way of generating artificial reach and creating a wave of 'social proof', as typical users are more likely to trust and share content that has been liked by many others. Bot-controlled accounts cost only a few cents each and are expected to be blocked quickly. More elaborate inauthentic accounts require some direct human control. They can cost up to several hundred euros and often remain online for years.

The ‘Black market’ for social media
manipulation
The scale is greater than previously thought. The infrastructure for developing and maintaining social media manipulation software, generating fictitious accounts, and providing mobile proxies is vast.

The openness of this industry is striking.


Rather than a shadowy underworld, it is an
easily accessible marketplace that most web
users can reach with little effort through
any search engine. In fact, manipulation
service providers advertise openly on major
platforms.

Russian service providers dominate the


social media manipulation market. Virtually
all of the major manipulation software and
infrastructure providers identified by us are
of Russian origin.

The size of the social media manipulation


industry is troubling. We have identified
hundreds of providers. Several have many
employees and significant revenue. It is clear
that the problem of inauthentic activity is
extensive.

Commitments to prevent platform abuse

Social media companies have made several formal statements expressing their intent to tackle abuse of their platforms. The clearest formal commitment occurred in September 2018, when representatives of the major online platforms agreed on a self-regulatory Code of Practice to address the spread of online disinformation.1

One important part of the Code of Practice was a commitment to put into place clear policies for identifying and handling the misuse of automated bots, and to enforce these policies within the European Union.

The European Commission has urged social media companies to step up their efforts and, in view of the 2019 European elections, the Commission and the European Regulators Group for Audiovisual Media Services (ERGA) assessed the actions taken by Facebook, Google, and Twitter based on reports submitted by these platforms to the Commission.2

The Commission notes that the social media platforms reported that they had taken action against inauthentic behaviour to limit the scope of spam and disinformation globally.

Google reported that it had globally removed more than 3.39 million YouTube channels and 8,600 channels for violations against its spam and impersonation policies. Facebook disabled 2.19 billion inauthentic accounts in the first quarter of 2019 and acted specifically against 1,574 non-EU-based and 168 EU-based pages, groups and accounts engaged in inauthentic behaviour targeting EU Member States. Twitter challenged almost 77 million spam-like or inauthentic accounts globally.3

These numbers are impressive, but by themselves they do not prove effectiveness. It is important to evaluate whether the social media companies are truly living up to their commitments, and to independently verify their ability to counter misuse of their platforms.

In this report, we use experimental methods to identify how hard it is to circumvent the measures that should now have been adopted. Without access to data from the social media companies, we had to develop a much more creative approach to the problem.

Building on our previous work on the 'black market' for social media manipulation, we decided to use services selling inauthentic social media interactions to our benefit. The scale and effectiveness of the market for manipulation enables experiments to test and assess the ability of individual social media companies to counter manipulation.

Who we are
The NATO Strategic Communications Centre of Excellence is a multi-nationally constituted and NATO-accredited international military organisation. We are not part of the NATO Command Structure, and are not subordinate to any other NATO entity.

Our strength is built by multinational and cross-sector participants from the civilian, military, private, and academic sectors and from the use of modern technologies, virtual tools for analysis, research, and decision making.

NATO and the European Union (EU) are essential partners who have developed a closer cooperation to improve security for European citizens during the last few years—with joint declarations made to that effect in 2016 and 2018.4 In Warsaw in July 2016, the two organisations outlined areas for strengthened cooperation in light of common challenges to the east and south. Areas of cooperation include countering hybrid threats, enhancing resilience, building defence capacity, and maintaining and improving cyber defence.5 Measures to bolster resilience to hybrid threats—from disinformation campaigns to emergent crises—are an essential part of NATO-EU cooperation today.

We developed this experiment in support of the European Union Action Plan against Disinformation6 and the self-regulatory Code of Practice7 to address the spread of online disinformation.

The malicious use of social media has proven to be an important tool for actors conducting influence activities against the interests of the EU and NATO. Bolstering our collective resilience requires a deeper understanding of this problem space so that we can establish effective analysis, prevention, and early detection. This will only be possible if we identify and address the vulnerabilities of social media platforms.

Spending 300 EUR, we bought 3 530
comments, 25 750 likes, 20 000 views, and
5 100 followers, enabling us to identify 18 739
inauthentic accounts being used for social media
manipulation.

SOCIAL MEDIA MANIPULATION EXPERIMENT

Introduction to the experiment

To test the ability of Facebook, Instagram, Twitter, and YouTube to identify and remove manipulation on their platforms we bought comments, views, and likes from a range of European and Russian social media manipulation service providers.

We structured the experiment so that we could measure and evaluate the performance of both manipulation service providers and social media platforms.

To limit the risk of unintentionally influencing real conversations online, we conducted the vast majority of the experiment by buying engagement on inauthentic profiles we created ourselves.

To assess if there is a difference between the various platforms' ability to counter bought manipulation on verified accounts, we also purchased comments and likes on a few real verified posts on each platform.

To make sure that we did not influence real conversations we only bought engagement on posts that were at least six months old and contained neutral apolitical messages such as New Year's greetings.

The comments we bought were simple messages of a positive nature such as
'Hello!' and 'Thank you!' (see case study on page 25). Engaging with posts that likely would not receive genuine engagement also enabled more accurate measurement of the purchased engagement.

As the context is also important, we chose to collect our data in the context of an election—a time when platforms had committed to be especially vigilant. The findings presented below, therefore, represent something of a best-case scenario for the social media companies, as they had committed to dedicating extra resources to prevent abuse during this time period.

It could be argued that bought manipulation is more likely to be detected if it is placed on current content, but because we wanted to test the ability of the social media companies to identify and block bought manipulation, it was important that our experiment did not prompt users or account managers to report our activity to the social media companies, as this would have 'poisoned' our data. We did not want to test the ability of social media managers or the public to detect and report inauthentic activity.

We bought our engagement from commercial manipulation service providers. This means that if the inauthentic accounts and engagement we bought were not removed, then malicious activity conducted by other actors using the same services and the same accounts also did not get removed. Our experiment, therefore, offers insight into the ability of the social media companies to deal with the commercial manipulation industry.

[Timeline of the experiment (2019):
May-June: buying social media manipulation, tracking delivery, and tracking the ability of the social media platforms to identify and remove the manipulation.
July: reporting a random sample of the identified inauthentic accounts to the social media companies.
July-August: tracking the ability of the social media companies to remove reported, confirmed inauthentic accounts.
August: data analysis and verification.]

The scale of the experiment

To conduct the experiment we bought engagement on 105 different posts on Facebook, Instagram, Twitter, and YouTube using 11 Russian and 5 European (1 Polish, 2 German, 1 French, 1 Italian) social media manipulation service providers. Spending 300 EUR, we bought 3 530 comments, 25 750 likes, 20 000 views, and 5 100 followers, enabling us to identify 18 739 accounts being used for social media manipulation.

The experiment was carried out during six weeks in May and June 2019. To assess the ability of the platforms to remove the inauthentic engagement, we monitored the bought engagement from before engagement to one month after engagement. We reported the inauthentic engagement to the social media companies in July and continued monitoring through the end of August 2019 to measure the time it took for the social media platforms to react.

During the experiment, we recorded statistics on how quickly the manipulation service providers were able to deliver their services, and whether the quantity delivered was accurate. We then collected data on how the four social media platforms responded to the manipulated content by periodically measuring whether it had been removed.

The experiments were divided into several blocks of work, visualised below. In the following chapters we provide a detailed analysis of the ability of individual social media companies to detect and counter manipulation of their services.

[Five steps of the experiment:
1. Buying likes, comments, views, and followers for neutral posts on our own inauthentic accounts.
2. Buying likes, comments, views, and followers for neutral apolitical posts.
3. Tracking the performance and response time of platforms in removing inauthentic activity.
4. Tracking how long inauthentic accounts stay on the platform and with what they engage.
5. Tracking how long it takes to remove accounts after reporting a random sample.]
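The periodic measurement described above amounts to simple bookkeeping: record the engagement counts on each manipulated post at regular intervals and compare them with the quantities ordered. The sketch below only illustrates that idea and is not the tooling used in the experiment; the fetch_engagement_counts() helper is hypothetical and would have to be implemented separately for each platform (via its API or by reading the public post page).

```python
import time
from datetime import datetime, timezone

def fetch_engagement_counts(platform: str, post_id: str) -> dict:
    """Hypothetical helper: return current counts for a post,
    e.g. {"likes": 120, "comments": 4, "views": 900}."""
    raise NotImplementedError("platform-specific data collection goes here")

def monitor(posts: list[dict], interval_hours: int = 24, rounds: int = 28) -> list[dict]:
    """Record timestamped engagement counts for each monitored post.

    Each item in `posts` looks like:
    {"platform": "twitter", "post_id": "...", "ordered": {"likes": 500, "comments": 50}}
    """
    snapshots = []
    for _ in range(rounds):
        for post in posts:
            counts = fetch_engagement_counts(post["platform"], post["post_id"])
            snapshots.append({
                "time": datetime.now(timezone.utc).isoformat(),
                "platform": post["platform"],
                "post_id": post["post_id"],
                "counts": counts,
            })
        time.sleep(interval_hours * 3600)  # e.g. one snapshot per day
    return snapshots
```

Comparing successive snapshots against the ordered quantities shows both whether and when a provider delivered in full, and whether and when a platform began removing the purchased engagement.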

11 ����������������������������������������������������������������������������
Our assessment criteria

We assessed the performance of the four social media companies studied according to seven criteria designed to measure their ability to counter the malicious use of their services. These criteria focus on detailed aspects of social media manipulation, including blocking the creation of inauthentic accounts and the ability of platforms to recognise and remove coordinated inauthentic behaviour both independently and after such behaviour had been reported.

These criteria can also serve as general benchmarks for assessing the ability of platforms to counter social media manipulation.

1. Success in blocking the creation of inauthentic accounts: Inauthentic accounts are critical for the functioning of manipulation services, and platforms aim to prevent their creation. Blocking accounts raises the barrier for manipulation, making it more difficult and costly.

2. Ability to detect and remove inauthentic accounts: This ability is important to combat the spread, impact, and 'time-on-platform' of inauthentic activity.

3. Ability to detect and remove inauthentic activity: Given the speed of social media, timely detection is important for limiting the effects of social media manipulation.

4. Ability to remove traces of inauthentic accounts: When inauthentic accounts are removed, it is important that the activities they have performed are also removed. Not all platforms succeed at this.

5. Cost of purchasing manipulation: The more costly it is to buy manipulation, the less likely it is that large scale campaigns will be carried out.

6. Speed of delivery: Rapid and successful delivery of manipulation indicates that a platform has insufficient protection. Slow delivery indicates providers need to drip-feed interventions to avoid anti-manipulation efforts.

7. Responsiveness to inauthentic activity: As a last resort, platforms turn to user moderation and reports of fraudulent activity. The ability of the platforms to quickly assess and respond to reports is an important part of combating platform abuse.
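For readers who want to apply these criteria as benchmarks to other platforms or future experiments, they can be captured in a simple scorecard structure. The sketch below is purely illustrative: the criterion names paraphrase the list above, and the poor/improving/good scale mirrors the one used in the assessment table later in this report.

```python
from dataclasses import dataclass, field

# The seven assessment criteria, paraphrased from the list above.
CRITERIA = (
    "blocking inauthentic account creation",
    "detecting and removing inauthentic accounts",
    "detecting and removing inauthentic activity",
    "removing traces left by inauthentic accounts",
    "cost of purchasing manipulation",
    "speed of delivery",
    "responsiveness to reports of inauthentic activity",
)

RATINGS = ("poor", "improving", "good")

@dataclass
class PlatformScorecard:
    """Benchmark ratings for one platform across the seven criteria."""
    platform: str
    ratings: dict = field(default_factory=dict)  # criterion -> rating

    def rate(self, criterion: str, rating: str) -> None:
        if criterion not in CRITERIA:
            raise ValueError(f"unknown criterion: {criterion}")
        if rating not in RATINGS:
            raise ValueError(f"rating must be one of {RATINGS}")
        self.ratings[criterion] = rating

    def unrated(self) -> tuple:
        """Criteria that still need an assessment."""
        return tuple(c for c in CRITERIA if c not in self.ratings)
```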

Assessment of social media company ability to respond to inauthentic behaviour online

Blocking inauthentic account creation (Criterion 1)

In order to conduct the experiment, we had to set up our own inauthentic accounts. These accounts were used to upload content, which we then manipulated using Manipulation Service Providers (MSPs). Creating inauthentic accounts is becoming harder to do as some of the social media platforms have stepped up their efforts to combat inauthentic accounts. From our own pool of inauthentic accounts Facebook suspended 80 percent, Twitter 66 percent, and Instagram 50 percent. YouTube did not suspend any of our accounts.

By actively monitoring our own settings during account creation we were able to identify why we were blocked. The reasons varied from case to case, but included web browser cookies and the use of specific IP addresses. Manually contacting support also allowed us to unblock accounts we needed for the experiment.

The systems used by Facebook, Instagram, and Twitter for protecting against the registration of multiple (inauthentic) accounts from a single IP address or a VPN are generally effective. The measures put into place by Twitter to prevent inauthentic account creation were especially hard to circumvent.

The experience of creating our own inauthentic accounts allows us to conclude that it is becoming far more difficult for the average user to create inauthentic accounts on the social media platforms we tested. However, the measures they use are not robust enough to stop persistent users or organisations. Of course, this is a step in the right direction, but more needs to be done to block inauthentic accounts from being created. YouTube especially needs to improve its efforts, as it is by far the easiest platform on which to create inauthentic accounts.

Countering manipulation (Criteria 2-4)

We assessed three criteria for the ability of social media companies to undo the effects of manipulation. First, the removal of bought activity, such as comments or likes. Second, the removal of the accounts used to deliver the manipulation. Finally, undoing all the effects created by inauthentic accounts. More advanced effect management could include notifications to account owners that their content has been manipulated, informing users who have been exposed to manipulation, and informing the public of significant cases.

While the different social media companies each have their strengths and weaknesses, one platform performs poorly no matter the criterion—Instagram was largely unable to detect and counter any manipulation. Instagram managed to remove only 5% of bought inauthentic comments, and virtually none of the inauthentic likes or views were corrected.

YouTube is the only platform that succeeded in reducing manipulated view counts.

Across all platforms, the first decrease of purchased engagement is most often recorded on the third to the fifth day after the purchase, indicating that even when manipulation is removed, the removal is often too slow to be effective.

Cost of purchasing manipulation (Criterion 5)

To assess the cost of the services we sampled the offerings of five Russian providers.

The figure below shows how many likes, comments, views, or followers it is possible to buy for 10 EUR. The figures represent the average price quoted by five core Russian MSPs.

The cost of manipulation is a good indicator of how effective social media platforms are at combating manipulation. When accounts used to perform manipulation are removed, MSPs have to spend time and money to replace them. When social media platforms redesign to break the scripts used to seed manipulation, MSP developers have to update their scripts. These costs are passed on to their consumers.

For only 300 EUR, we were able to buy 3 530 comments, 25 750 likes, 20 000 views, and 5 100 followers. A high proportion of this cost went to the more expensive and often less reliable European providers.

To make a balanced assessment of the relative cost of manipulating the platforms, we identified five inexpensive and reliable Russian MSPs that also provided a significant part of the manipulation services resold by other MSPs.

HOW MUCH MANIPULATION CAN YOU BUY FOR 10 EUR?

             YouTube    Facebook    Twitter    Instagram
Likes        781        1 204       2 173      4 000
Comments     153        131         119        212
Views        3 267      4 347       11 627     13 158
Followers    458        990         2 439      3 846

Buying views on YouTube is more expensive than on other platforms. Overall, manipulating Instagram is the cheapest.
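The quantities in the table are straightforward arithmetic: average the per-unit price quoted by the sampled providers and convert the result into units per 10 EUR. A minimal sketch of that conversion follows; the prices in it are made-up placeholders, not the quotes we actually received.

```python
BUDGET_EUR = 10.0

# EUR per single engagement, as quoted by several providers (placeholder values).
quotes = {
    ("example-platform", "likes"): [0.010, 0.012, 0.008],
    ("example-platform", "views"): [0.0020, 0.0025, 0.0030],
}

def units_per_budget(prices_eur: list[float], budget: float = BUDGET_EUR) -> int:
    """Average the quoted per-unit prices and return how many units the budget buys."""
    avg_price = sum(prices_eur) / len(prices_eur)
    return int(budget / avg_price)

for (platform, service), prices in quotes.items():
    print(f"{platform} {service}: {units_per_budget(prices)} per {BUDGET_EUR:.0f} EUR")
```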

Overall, YouTube is the most expensive service to manipulate. The cost of manipulation services for Twitter and Facebook is roughly similar, though it varies slightly depending on the service used. The cost of manipulation services for Instagram is nearly half that of the same manipulation for Twitter and Facebook, and only a fifth of the cost of YouTube manipulation.

Manipulation of YouTube is the most expensive for everything except comments, whereas manipulation on Instagram is the cheapest in every category.

Speed of delivery (Criterion 6)

Social media manipulation services are widely available, accessible, and professional. Almost all of the manipulation service providers we used were highly responsive to questions and complaints, indicating that the manipulation service industry has managed to develop into a reasonably reliable industry.

We found that Instagram manipulation services overall were the most reliable, while comment services for Facebook, Twitter, and YouTube were the least reliable.

With the exception of Twitter, all the bought manipulation was delivered within 24 hours on average. While the average time of complete delivery on Instagram was also less than 24 hours, a number of service providers offered significantly faster services. Our record time from purchase to delivery was less than five minutes on Instagram. Twitter was generally the slowest, with bought manipulation first appearing after one hour and completing within two days on average.

There is a considerable amount of cross-activity in the manipulation industry. We noted that different European providers would often use the same inauthentic accounts, and these accounts were often of Russian and Ukrainian origin, indicating that many of the European MSPs use the same Russian sub-providers.

While most manipulation service providers are quite reliable, the volumes delivered were often not what had been bought. Sometimes we received fewer, but most often we received more. This may be because providers know some of their manipulation efforts will disappear—indeed at least one supplier offers a monthly guarantee, promising to periodically 'top-up' numbers if the social media company counters the manipulation.

However, we believe the real explanation is that many providers resell services offered by others and therefore are unsure exactly how many interventions will be delivered in a timely fashion.

This argument is supported by the considerable overlap in account use between providers. On Twitter, for instance, we found many examples of the same inauthentic accounts being used by five or more separate MSPs.

Responsiveness (Criterion 7)

After the end of phase one of the experiment, we reported 100 random accounts used for social media manipulation for each platform and then we monitored how long it took for the platforms to remove the accounts. It is worth reiterating that the accounts we reported were the accounts that delivered the manipulation we bought, meaning that we were 100% certain that these accounts were engaging in social media manipulation.

Three weeks from the date we reported the accounts, the social media platforms had removed only 4.5% of the accounts we had reported to them.

Given the low number of accounts removed it is clear that social media companies are still struggling to remove accounts used for social media manipulation, even when the accounts are reported to them.

[Removed accounts three weeks after reporting: Instagram 0 out of 100, Twitter 12 out of 100, Facebook 3 out of 100, YouTube 3 out of 100.]

The social media manipulation industry

This experiment has strengthened many of the conclusions from our report on the 'black market' for social media manipulation.8

The manipulation market is widely available, and there is a large degree of reselling, which means that different providers often use the same set of accounts for their manipulation services. Some of the best MSPs are transparent about the size of the underlying set of inauthentic accounts and the quality of the service they provide. The worst providers simply pocket the money received without delivering any services.

But even if the market is somewhat chaotic, it functions reasonably well and most orders are delivered in a timely and accurate manner. Social media manipulation remains widely available, cheap, and efficient.

During platform updates, the manipulation services usually stop functioning for a few days, but so far, they have always been able to circumvent new safeguards by the platforms and resume service within a week or so. It is clear that so far the social media platforms have been mostly unable to prevent the MSPs from abusing their platforms.
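The follow-up described under Responsiveness (Criterion 7) above, reporting a fixed sample of accounts and then checking how long removal takes, also reduces to simple bookkeeping. The sketch below illustrates the idea only; the account_still_exists() helper is hypothetical and stands in for a per-platform check of whether an account's profile still resolves.

```python
from datetime import datetime, timezone

def account_still_exists(platform: str, account_id: str) -> bool:
    """Hypothetical helper: True if the account still resolves on the platform."""
    raise NotImplementedError("per-platform profile lookup goes here")

def check_reported_sample(reported: list[dict]) -> dict:
    """Record which reported accounts have been removed so far.

    Each item looks like:
    {"platform": "twitter", "account_id": "...", "reported_at": "2019-07-15"}
    Returns e.g. {"twitter": {"reported": 100, "removed": 12}, ...}
    """
    now = datetime.now(timezone.utc).isoformat()
    summary = {}
    for account in reported:
        stats = summary.setdefault(account["platform"], {"reported": 0, "removed": 0})
        stats["reported"] += 1
        if not account_still_exists(account["platform"], account["account_id"]):
            stats["removed"] += 1
            account.setdefault("first_seen_removed", now)  # for time-to-removal bookkeeping
    return summary
```

Running such a check on a schedule, for example daily, yields both the removal percentage after three weeks and an upper bound on how quickly each platform acted on the reports.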

While YouTube is the worst at removing inauthentic
accounts, it is best at countering inauthentic likes and
artificial video views.

ASSESSMENT OF THE PLATFORMS
RELATIVE STRENGTHS AND WEAKNESSES
Assessment of platforms

Facebook

Facebook was the platform that was most successful at blocking inauthentic account creation. Facebook has sophisticated anti-automation systems built into the structure of the platform, and several MSPs struggled to offer consistent services. In some cases, otherwise reliable vendors were unable to deliver the promised manipulation on Facebook.

However, vendors who were able to circumvent Facebook's counter-measures had very high success rates. Even after many weeks, it was rare for any of the inauthentic interactions to have disappeared. While Facebook did better than Instagram by removing roughly 10% of the inauthentic likes after a month, it did worse than Instagram by removing 0% of the inauthentic comments. It is also noteworthy that both Facebook and Instagram were especially weak at countering inauthentic video views. No inauthentic views were removed by the platforms, and the cost of inauthentic views on the platforms is disproportionately low compared to Twitter and YouTube.

Thus Facebook resembles a fortress with formidable defences facing the outside world, but qualified actors are still able to scale the walls of Facebook; policing and oversight within the walls is far less effective.

Instagram

Instagram was somewhat successful at blocking account creation with roughly a 50%
block rate; however, it is quite easy to overcome their blocking by using relatively simple techniques such as VPNs and cache control.

The cost of manipulating Instagram was the lowest for all types of manipulation—likes, views, comments, and followers. Manipulation service providers found Instagram to be the easiest platform to manipulate. On average, the service provided deviated by 18% from what we ordered, and most orders were delivered within 24 hours.

Instagram seemed to have a flaw in their system, as their counters for likes and comments did not reflect real changes—during our experiment the counters went up but never down. When inauthentic accounts were removed there was no change in the like and comment counters. To get accurate recordings we had to compare the list of user engagements with the counter numbers for each engagement. A consequence of this flaw is that manipulating Instagram is easier, because even if your bots are blocked the effect of the manipulation will remain on the platform.

Instagram also performed poorly in blocking content manipulation. Instagram removed only one percent of the bought likes during the month-long test phase. Instagram also had the lowest number of blocked accounts—14%—by the time we started reporting the inauthentic accounts to the social media platforms. Instagram did better than Twitter by blocking 44% of the inauthentic followers we bought, but worse than YouTube.

During our experiment phase Instagram launched a significant upgrade of their platform, which caused some of the manipulation service providers to pause their services, but within a few weeks all the MSPs had updated their systems and were able to resume their manipulation services.

Our experiment clearly shows that Instagram has significant challenges with countering abuse on their platform, as manipulating Instagram is both easy and cheap.

Twitter

Twitter is currently the most effective platform at countering abuse of their services. It takes longer for bought engagement to appear on Twitter and the quality of delivery is more uneven than on the other platforms. Even so, all the MSPs delivered all the services we bought without any refusals or failed deliveries.

Twitter also identified and removed more manipulation than the other platforms. On average half of the likes and retweets bought on Twitter were removed during the testing period. At 35%, Twitter had blocked the highest proportion of accounts by the time we started reporting the accounts. This indicates that accounts used by MSPs are removed most effectively on Twitter.

Twitter failed to remove any of the bought views, and we were unable to measure the number of comments removed because of a Twitter feature that made it difficult for us to determine why a comment was removed [the much debated 'This tweet is unavailable' feature].

While Twitter is effective at blocking new inauthentic accounts, the legacy of their inadequate anti-spam efforts still impacts the platform today, as aged inauthentic accounts that were created before Twitter improved their defences seem to remain active on the platform.

Twitter should be commended for doing the most to combat malicious use of their platform. They are currently ahead of the other platforms we tested. Even so, it is still very possible to manipulate content and conversations on Twitter; it just requires a little more effort than on the other platforms.

YouTube

Our assessment of YouTube shows a split picture. While YouTube is the worst at removing inauthentic accounts, it is best at countering inauthentic likes and artificial video views. YouTube's ability to counter inauthentic comments is twofold; while many of the manipulation providers struggled to provide service, one provider was extremely efficient in delivering inauthentic comments on YouTube. Nine out of ten comments delivered remained active on the platform throughout the experiment.

Countering artificial views should be the greatest concern for YouTube, as fake views generate fake advertising costs for YouTube advertisers. Based on our experiment YouTube is the industry leader in countering artificial views; however, a 10% reduction is far from sufficient for preventing platform abuse. From previous experiments we have seen that inauthentic activity on YouTube can remain active for many months without being detected, an insight this experiment seems to strengthen.

In many ways, YouTube is the least transparent platform, and it is difficult to identify inauthentic accounts on YouTube. The popularity of the platform, the difficulty [for external researchers] of detecting platform manipulation, and the potential financial rewards of manipulation make YouTube an ideal target. The fact that YouTube is the most expensive platform to manipulate is either a testament to its defensive actions or to the popularity of YouTube manipulation. We currently do not know which.

Relative performance

There is a significant difference between the ability of the different social media platforms to counter manipulation.

While Twitter outperformed the others, it is far more challenging to rank the relative performance of YouTube, Facebook, and Instagram. In our final assessment we decided to prioritise a platform's ability to counter manipulation ahead of other responses, which places YouTube ahead of Facebook and Instagram. At the same time, we assess that none of the four platforms are doing enough to prevent the manipulation of their services.

ASSESSMENT OF THE PLATFORMS RELATIVE STRENGTHS AND WEAKNESSES

[Table rating YouTube, Facebook, Twitter, and Instagram as Poor, Improving, or Good on each of the following criteria:
1. Ability to block fake account creation
2. Ability to identify and remove inauthentic accounts
3. Ability to remove fake likes, views etc. delivered by inauthentic accounts
4. Ability to undo historic activity made by inauthentic accounts
5. Manipulation costs (more expensive = harder to manipulate the platform)
6. Speed of delivery (slower = harder to manipulate the platform)
7. Speed of fake account removal after being reported to the platform]

Illustration of relative performance of Twitter (1st), YouTube (2nd), Facebook (3rd), and Instagram (4th). Manipulation service providers are still winning.
The industry's ability to remove manipulation

Overall, social media companies are experiencing significant challenges in countering the malicious use of their platforms. While they are improving in the field of removing inauthentic followers, they are facing substantial difficulties in combating inauthentic comments and views.

Across all platforms, removal of purchased manipulation is most often first recorded on the third to the fifth day after purchase, which is worrying given the speed of social media. If manipulations are identified and removed only three to four days after they are posted on Instagram, Facebook and Twitter, the delayed efforts to counter manipulation will be less effective.

If we focus on the average reduction of manipulation per post after four weeks, then the relative performance of the different platforms is the same as the total decrease seen in the table below: Twitter 60%, YouTube 43%, Facebook 26%, and Instagram 1%.

             Comments   Likes   Followers   Views
Instagram    3%         1%      44%         0%
Facebook     0%         10%     n/a         0%
Twitter      1%         27%     37%         0%
YouTube      11%        30%     61%         10%

Percentage of inauthentic engagement removed after four weeks.

Industry's ability to remove inauthentic accounts

Six weeks after we started buying inauthentic social media engagement, just before we started reporting the accounts used to deliver the manipulation services, we measured how many were still active. The results are disturbing.

In total, just 17% of the bots we identified had been removed. This low figure shows that the social media companies' own algorithms for detecting misuse are ineffective across the board. The worst-performing services for blocking inauthentic accounts were Instagram and YouTube. Facebook ranked third, having removed 21%. And, according to this measure, the least-poorly-performing service was Twitter, which succeeded in removing 35% of the accounts.
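Percentages like those in the table above, and the account-removal figures below, come from comparing what remains online at a given point with what was originally delivered, aggregated per platform and engagement type. A minimal sketch of that aggregation, using hypothetical field names and made-up example numbers:

```python
from collections import defaultdict

def removal_rates(posts: list[dict]) -> dict:
    """Aggregate removal percentages per (platform, engagement type).

    Each item describes one manipulated post, e.g.
    {"platform": "facebook", "type": "likes", "delivered": 500, "remaining": 450}.
    """
    delivered = defaultdict(int)
    remaining = defaultdict(int)
    for post in posts:
        key = (post["platform"], post["type"])
        delivered[key] += post["delivered"]
        remaining[key] += post["remaining"]
    return {
        key: round(100 * (delivered[key] - remaining[key]) / delivered[key], 1)
        for key in delivered
        if delivered[key] > 0
    }

# Example with made-up numbers (not our measurements):
example = [
    {"platform": "facebook", "type": "likes", "delivered": 500, "remaining": 450},
    {"platform": "facebook", "type": "likes", "delivered": 300, "remaining": 270},
]
print(removal_rates(example))  # {('facebook', 'likes'): 10.0}
```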

These figures bear consideration: bad actors wishing to manipulate social media platforms can expect that only a small fraction of inauthentic activity will be proactively removed. So, most malicious activity goes unchallenged, and even if it is removed, its effects often remain.

Social media companies report that they block millions of inauthentic accounts annually,9 but this does not seem to influence the ability of manipulation service providers to manipulate their platforms. One explanation for the seemingly impressive numbers reported by the social media companies could be that millions of accounts are blocked upon creation.

If malicious actors fail repeatedly before successfully creating inauthentic accounts, the result is that a high number of inauthentic accounts are blocked but inauthentic accounts are still eventually created.

[Percentage of inauthentic accounts removed after six weeks:10, 11
Instagram: 14% of 14 330 accounts identified
Facebook: 21% of 2 427 accounts identified
Twitter: 35% of 1 982 accounts identified
YouTube: 11% of 471 accounts identified]

Inauthentic accounts' patterns of interaction on social media

Following the accounts used for manipulation to analyse who else uses their services and which content they are manipulating is technically difficult due to data access limitations.

While we were able to examine accounts on Twitter, on Facebook we were only able to see what pages the accounts had engaged with, and on Instagram we had to conduct a manual examination of a random sample of accounts. We found no way to study this issue on YouTube.

Our experiment shows that the vast majority of bought engagement is used for commercial purposes. Instagram seems to have the biggest problem with bought manipulation on
commercial influencer accounts. Some of the
influencers we identified even had contracts
with major international brands and were ma-
nipulating their reach and engagement statis-
tics.

While we did identify political manipulation, and as many as four of five ac-
counts used for manipulation on Facebook
had been used to engage with political
content to some extent, we assess
that more than 90% of purchased en-
gagements on social media are used for
commercial purposes.

At the same time, it should be noted that we did identify at least one known pro-Kremlin
bot account in our pool of identified inauthen-
tic accounts. This indicates that even if po-
litical manipulation is only a minor function
of the manipulation industry, it is definite-
ly being used for this purpose as well. The
inauthentic accounts we identified had been
used to buy engagement on 721 political pag-
es and 52 government pages, including the
official accounts of two presidents, the offi-
cial page of a European political party, and a
number of junior and local politicians in Eu-
rope and the United States. The vast majority
of the political manipulation, however, was
aimed at non-western pages.

CASE STUDY: PROTECTING VERIFIED ACCOUNTS
While most of our experiment was conducted by buying engagement on our own accounts, we also tested whether the platforms are better at protecting verified institutional accounts.10

To conduct this part of the experiment we bought engagement on apolitical messages such as New Year's greetings on the posts of a few European institutional accounts. The comments we bought were simple messages of a positive nature such as 'Hello!' and 'Thank you!'

We have chosen four posts by commissioners Jourová, Katainen and Vestager to illustrate this case study. The results are representative of our overall conclusion that engagement is quickly delivered and remains active for a significant time period. Our examples show that inauthentic activity can remain active for a long time. In December 2019, 30 weeks after the experiment, a significant proportion of the inauthentic content was still online.

These four examples also show that sometimes manipulation service providers over-deliver, sometimes they fail, but mostly they are right on target. Most often it is the quality of the manipulation service provider, not the platform, which determines effectiveness.

We had predicted that verified accounts and posts would be better protected, but based on our case study, this does not seem to be the case. We therefore conclude that verified accounts, even institutional accounts, are no better protected against manipulation than any other accounts on social media platforms. More specifically, we were unable to detect any difference between the protection of verified and ordinary accounts on Instagram, but for Facebook there was a difference between the reductions in inauthentic likes. Verified accounts had an average of 27% of their likes removed after four weeks, while our ordinary accounts only lost 5%. For inauthentic comments and views there was no difference on Facebook.

On Twitter, there was a clear difference between bought engagement on our ordinary posts versus bought engagement on verified posts, but without any clear pattern. 37% of comments on our own posts were removed, but only 1% of inauthentic comments were removed from the verified posts. For likes and retweets Twitter was more effective at reducing inauthentic engagement on verified posts, by roughly 20%. On YouTube we saw a 23% removal of inauthentic likes on verified videos versus a 1% decrease on ordinary videos. However, for comments the results were reversed, with a 13% removal rate on ordinary videos compared to an 8% decrease on verified videos.

So while we did identify differences in the ability of social media platforms to protect verified European institutional accounts, the differences we identified were either random or insignificant. This leads us to conclude that verified institutional accounts are likely no better protected against manipulation on social media platforms than ordinary accounts.

Self-regulation is not working. The
manipulation industry is growing year by
year. We see no sign that it is becoming
substantially more expensive or more
difficult to conduct widespread social media
manipulation.

CONCLUSIONS

Since its foundation, the NATO Strategic Communication Centre of Excellence has studied social media manipulation because it is an important and integral part of the influence campaigns malicious state and non-state actors direct against the Alliance, Allied nations, and Partner nations. Bolstering resilience to influence campaigns is an essential part of what we do.

In 'Tackling online disinformation: a European approach', the European Commission rightly noted that:

[T]he exposure of citizens to large scale disinformation, including misleading or outright false information, is a major challenge for Europe. [...] [N]ew technologies can be used, notably through social media, to disseminate disinformation on a scale and with speed and precision of targeting that is unprecedented, creating personalised information spheres and becoming powerful echo chambers for disinformation campaigns. [...] Mass online disinformation campaigns are being widely used by a range of domestic and foreign actors to sow distrust and create societal tensions, with serious potential consequences for our security.12

In this context, it is vitally important that social media companies do their utmost to prevent the abuse of their platforms.

We designed this experiment to test the leading social networks' implementation of the part of the self-regulatory Code of Practice which addresses inauthentic accounts and coordinated inauthentic behaviour online.

Based on this experiment and several other studies we have conducted over the last two years, we assess that Facebook, Instagram, Twitter, and YouTube are still failing to tackle coordinated inauthentic behaviour online.

Self-regulation is not working. The manipulation industry is growing year by year, and we see no signs that conducting widespread social media manipulation is becoming substantially more expensive or more difficult.

We have followed the reports of the social media companies as delivered within the framework of the European self-regulatory Code of Practice. We recognise that all the platforms have undertaken efforts to address coordinated inauthentic behaviour, fake accounts, and malicious, bot-driven activity, as well as terms of service enforcement, during the past year. At the same time, it is evident that the transparent and independent assessments that would enable accurate conclusions are still missing.

In contrast to the reports presented by the social media companies themselves, we offer a different picture: We were easily able to buy more than 54 000 inauthentic social media interactions with little or no resistance. Our experiment shows that social media platforms can still be easily manipulated.

Most of the inauthentic accounts we monitored remained active throughout the experiment. This means that malicious activity conducted by other actors using the same services and the same accounts also went unnoticed.

Even though our experiment was benign by design, the purchased engagement was not blocked – and neither was any of the other activity performed by the leading social media manipulation companies in Europe and Russia. We know this because the bulk of the inauthentic accounts used to deliver the engagement we bought were also used to deliver engagement other customers bought – and the inauthentic accounts mostly stayed active during our entire test. In fact, many are active still today.

Although the fight against online disinformation and coordinated inauthentic behaviour is far from over, an important finding of our experiment is that the different platforms aren't equally bad—in fact, some are significantly better at identifying and removing manipulative accounts and activities than others.

Investment, resources, and determination make a difference.

If we can conduct an experiment, then
so can the social media companies.

POLICY RECOMMENDATIONS

Set standards and require reporting based on more meaningful criteria

To further evaluate the impact and extent of inauthentic activity on social media platforms, more granular information is needed on the kind of inauthentic accounts blocked, which kind manages to gain access to the platforms, and what impact they are having. More detailed insights are also required about detected inauthentic coordinated activity, including targets, levels of engagement, and the issues exploited to manipulate public opinion. Furthermore, a common standard needs to be developed so that reports from different social media companies can be compared to a greater extent. Finally, a system of independent auditing should be considered in order to build and maintain trust in the reports from the social media companies.

Increase transparency

If we can conduct an experiment, then so can the social media companies. We faced significant challenges because we were forced to collect snippets of information from the outside, but the companies could test their defences and report the results with much greater accuracy if they chose to do so. Currently, they are mostly reporting good news—such as how successful they have been in preventing the creation of inauthentic accounts. In doing so, they present a picture that provides too little insight into how many inauthentic accounts eventually gain access and what they do on the platforms.

More transparency is needed to understand the scope and effect of manipulation. Meta-manipulation, the practice of buying engagement to trigger algorithms to boost posts, is especially worrying since it is very difficult for outside researchers to identify.

Establish independent and well-resourced oversight

Independent oversight could provide the insight needed to better assess the progress of the social media companies in countering inauthentic activity on their platforms. Today we are in a situation where efforts to analyse, evaluate and assess social media companies are facing a resource disadvantage. Data is becoming scarcer and our opportunities to research this field are constantly shrinking. This effectively transfers the ability to understand what is happening on the platforms to the social media companies. Independent and well-resourced oversight is needed.

Regulate the market for social media manipulation

While we have focused a great deal on the ability of the social media companies to protect their platforms, it is also important that we turn our attention to the industry that profits from developing the tools and methods that enable this interference. Lawmakers should regulate the market for social media manipulation.

Social media platforms need to do more to counter abuse of their services

Manipulation service providers continue to advertise and promote their services on the very platforms they seek to undermine. Providers trafficking in YouTube manipulation services buy ads from Google—the owner of YouTube—and fearlessly promote their services using both advertisements and YouTube channels. It is far too easy to find and order manipulation services on the very platforms they seek to undermine.

WhatsApp, a company owned by Facebook, issued a stern warning in June 2019 noting that "[…] beginning on December 7, 2019, WhatsApp will take legal action against those we determine are engaged in or assisting others in abuse that violates our Terms of Service.[…]".13 As manipulation service providers engage in the systematic abuse of social media companies, it is surprising that there have not been more systematic efforts by the social media companies to counter the manipulation industry. The announcement by WhatsApp is an important step forward.

A whole-of-industry solution is needed

Recent studies have shown that social media manipulation and disinformation generate significant ad-revenue.14 In fact, a recent report argues that inauthentic influencer marketing is a 1.3 billion dollar per year problem.15

Our research confirms that commercial manipulation is indeed the main driving force for social media manipulation—gaming advertisers for profit. But the tools and methods developed and funded to scam the advertising industry are also used for political and national security interference. At the same time, the telecommunication industry has a responsibility to limit the use of SIM cards for manipulation services, most manipulation service providers depend upon financial payment solutions (such as PayPal) to function well, and they require Internet Service Providers to gain and maintain access to the internet. A whole-of-industry solution is needed to combat this problem.

Implications for NATO

Social media manipulation is a challenge for NATO because it is an important tool for malicious actors conducting influence activities against the interests of the Alliance. Bolstering our collective resilience requires us to understand this problem better so we can establish more effective procedures for analysis, prevention, and early detection. As the defences of the social media companies are still inadequate, we must continue to expect that antagonists will be able to exploit social media for malign purposes.

If antagonists are able to manipulate the information environment, the ability of the Alliance to effectively message in times of crisis or conflict will be hampered. Therefore, the Alliance must continue to refine its strategies for communication in a highly contested Information Environment.

Assessing the Information Environment requires a further refined ability to differentiate between genuine and inauthentic content. The findings of this study should be incorporated into the Alliance's continued efforts to enhance its ability to assess the Information Environment.

ENDNOTES

1. European Commission. “Code of Practice on Disinformation.” September 26, 2018.


2. EEAS - European External Action Service - European Commission. “Progress Report on Ac-
tion Plan against Disinformation”
3. Ibid.
4. EEAS - European External Action Service - European Commission. “EU-NATO Cooperation -
Factsheets”; NATO. “Relations with the European Union”
5. NATO. “Joint declaration by the President of the European Council, the President of the Euro-
pean Commission, and the Secretary General of the North Atlantic Treaty Organization”.
6. European Commission - European Commission. “Action Plan on Disinformation: Commission
Contribution to the European Council (13-14 December 2018)”.
7. European Commission. “Code of Practice on Disinformation” September 26, 2018.
8. NATO StratCom CoE. “The Black Market for Social Media Manipulation”.
9. European Commission. “Annual Self-Assessment Reports of Signatories to the Code of Prac-
tice on Disinformation 2019”.
10. For YouTube, we are reporting the percentage of comments removed as a proxy for users.
This estimate errs on the generous side, as comments can be moderated without a user be-
ing permanently banned from the platform.
11. The composition of bought engagement, and the type of inauthentic accounts used by the
manipulation service providers, varied for each social media company.
12. European Commission. “Communication - Tackling Online Disinformation: A European Ap-
proach”.
13. WhatsApp.com. “WhatsApp FAQ - Unauthorized Use of Automated or Bulk Messaging on
WhatsApp”.
14. Global Disinformation Index, “The Quarter Billion Dollar Question: How is Disinformation
Gaming Ad Tech?”.
15. CHEQ. “Ad Fraud 2019: The Economic Cost of Bad Actors on the Internet”.

Prepared and published by the
NATO STRATEGIC COMMUNICATIONS
CENTRE OF EXCELLENCE

The NATO Strategic Communications Centre of Excellence (NATO StratCom COE) is a NATO accredited
multi-national organisation that conducts research, publishes studies, and provides strategic commu-
nications training for government and military personnel.
Our mission is to make a positive contribution to the Alliance's understanding of strategic communi-
cations and to facilitate accurate, appropriate, and timely communication among its members as
objectives and roles emerge and evolve in the rapidly changing information environment.
Operating since 2014, we have carried out significant research enhancing NATO nations’ situational
awareness of the information environment and have contributed to exercises and trainings with
subject matter expertise.

www.stratcomcoe.org | @stratcomcoe | info@stratcomcoe.org
