
Original Article

Media, Culture & Society
2017, Vol. 39(7) 1027–1042
© The Author(s) 2017
DOI: 10.1177/0163443716686672
journals.sagepub.com/home/mcs

Disinformation and the media: the case of Russia and Ukraine

Ulises A Mejias
State University of New York at Oswego, USA

Nikolai E Vokuev
Syktyvkar State University, Russia

Abstract
The ongoing conflict between Russia and Ukraine can be analyzed as an instance where
the Internet has strengthened the power of political actors to create disinformation. But
it is no longer only the state-supported media monopoly that produces and disseminates
propaganda. Citizens themselves actively participate in their own disenfranchisement by
using social media to generate, consume or distribute false information, contributing
to a new order where disinformation acquires increasing authority. This essay follows
disinformation practices through the transition from broadcast to social media in post-
Soviet times and theorizes how the coexistence of old and new media in the production
of propaganda might inform our understanding of future scenarios, including in Western
democracies.

Keywords
disinformation, Internet, propaganda, Russia, social media, Ukraine

Introduction
In the chaotic and complicated aftermath of the ephemerally named Twitter Revolutions
– including the Occupy and Arab Spring protest movements – it seems pertinent to exam-
ine not only how activists benefitted from using social media to mobilize and organize

Corresponding author:
Ulises A Mejias, State University of New York at Oswego, 222B Marano, Oswego, NY 13126, USA.
Email: ulises.mejias@oswego.edu

but also how these tools may have been used to undermine the protests. While arguments
that posit social media as the principal cause of revolutions are now largely dismissed as
simplistic technological determinism, questions about the complex relationship between
social media technologies and political actors on different sides of a conflict are still
worth pursuing. To that effect, this essay examines recent events in the Russia–Ukraine
conflict, in which the pro-Russian governing party in Ukraine was ousted and replaced
by a pro-European party after a period of demonstrations known as Euromaidan, fol-
lowed by the annexation of the Ukrainian territory of Crimea by the Russian Federation
in March 2014. Our main thesis is that during these events, the use of social media gener-
ally weakened the power of civil society by allowing for the rampant spread of disinfor-
mation. While repressive governments and their agents are traditionally seen as the
sources of propaganda, the Russia–Ukraine conflict suggests that social media can also
give ordinary citizens the power to generate false and inaccurate information. That such
social media campaigns can be co-opted and redistributed via mass media channels to
amplify their effect is cause to fear similar applications in other parts of the world. Thus,
the lessons we derive about how authorities in Russia and Ukraine were able to disrupt
protest movements, using the same digital media platforms activists and citizens were
using, may influence our understanding of future conflicts elsewhere, including in osten-
sibly democratic regimes.
Because concrete evidence of the authorship of disinformation is difficult to obtain,
our approach in this essay is to theorize what happens when propaganda is co-produced
by regimes and citizens and disseminated through a combination of analog and digital
channels including social media. However, we do not seek to dismiss the potential of
digital networks to facilitate protest movements. While it is easy to ridicule the misplaced faith in the Internet as a magical gravedigger of dictatorship, plenty of evidence
suggests that new information and communication technologies (ICT) have helped pro-
test movements by strengthening civic organizations, lowering the cost of communica-
tion, increasing the speed of mobilization, making fundraising and other forms of support
more effective and so on. Since regimes were not prepared to contend with these effects,
there is concrete evidence that social media have been effective in disrupting the status
quo in the short term (Diamond, 2012; Howard, 2010; Lysenko and Desouza, 2014). It
is, however, our concern with long-term effects, beyond particular moments of protest,
that motivates us to question whether the Internet might actually better serve the interests
of oppression, not democratization. We posit that, as with previous technologies like
radio and television, the Internet is increasingly becoming – after a brief initial moment
of radical possibilities – a conservative form of mass media (McChesney, 2014; Wu,
2010), reducing the political agency of individuals by socially alienating them.
A commander in the US military described Russia’s ongoing disinformation cam-
paign as ‘the most amazing information warfare blitzkrieg we have ever seen in the his-
tory of information warfare’ (Vandiver, 2014: para. 3). Hyperbole aside, it is important to
theorize how new forms of control are being facilitated by media platforms in which citi-
zens actively produce and share disinformation. Our approach consists of using different types of sources (scholarly, popular and observational) to chronicle the evolution of disinformation campaigns in the post-Soviet context. This approach allows us to trace developing modes of disinformation in the transition from broadcast to participatory media in
Russia and Ukraine. The emerging feature of these new forms of disinformation is that it
is not only the state-controlled media organization that produces propaganda but citizens
themselves who actively participate in the creation of disinformation by using new plat-
forms to push their individual opinions to a point of excess, contributing to a new order
where disinformation acquires a certain authority. Whereas information spread by gov-
ernments or corporations can be skeptically dismissed, information produced and shared
by regular users (or what are perceived to be regular users) acquires authenticity, and
spreading this information is an act rewarded by social media platforms in terms of
increased social capital such as attention, popularity and visibility. In this context, it
might be instructive to recall what Deleuze (1997) observed of societies of control: that
an increase in opportunities for expression (in this case associated with social media)
does not necessarily mean an increase in opportunities for political empowerment:
‘Repressive forces don’t stop people expressing themselves but rather force them to
express themselves …’ (p. 129). Social media encourages this kind of self-expression,
and we see at least two important outcomes of disinformation acquiring authority through
being shared online: first, activists who rely heavily on social media can develop a dis-
torted sense of popular support for their cause that can politically backfire, and second,
private ownership of the social media platforms used during protests can make it easy to
co-opt and weaken social movements (Mejias, 2013; Morozov, 2012). More importantly,
this analysis suggests that civil society can become an active participant in its own dis-
empowerment by engaging in an excess of self-interested communication through the
production and consumption of disinformation. This disembodied compulsion to express
results not in the sharing of meaning, but in its obfuscation. It can lead to mistrust, inac-
tion, nihilism, or violence and seriously threaten public discourse, as the Russia–Ukraine
conflict has exemplified.

From mass to social media in post-Soviet regimes


Russian interference in Ukraine is sometimes discussed in western media circles as an
affront to western values, as if the fall of communism had settled once and for all who
was and who wasn’t on the right side of history. This is, unfortunately, a deficient histori-
cal explanation. The fall of communism had less to do with the triumph of western values
and more to do with fatigue over continued military failures in Afghanistan, destabilizing
reforms imposed during Mikhail Gorbachev’s rule, and the re-emergence of Orthodox
religion, which communism failed to stamp out (Gray, 2015). Ideologically, Russia has
now positioned itself as a viable alternative to western civilization (Shevtsova, 2014),
and this belief is supported by a socially conservative rhetoric focused on condemning
non-traditional practices in faith, sex, education, art and culture (Lipman, 2014).
It is in the context of this war against western ideas that Russia is keen on preventing
Ukraine from becoming part of Europe. Furthermore, a shift in Ukraine’s alliances would
send a very clear message to capitalists with investments in the region about the Russian
state’s inability to protect their interests. Thus, the ousting of pro-Russian President
Viktor Yanukovych in February 2014 was seen as a clear affront to the status quo,
although it bears pointing out that the pro-European movement that led to his removal
was deeply polarizing within Ukrainian society as well. The military intervention that
followed, which resulted in the annexation of Crimea by the Russian Federation, launched
a new stage of conflict between the two nations.
But as important as the political ramifications of this war are, this essay is narrowly
concerned with media practices. The Russia–Ukraine conflict represents an interesting
juncture at which disinformation practices are replicated and transmuted across ‘old’
(broadcast) and ‘new’ (social) media. On both sides, the opposition has relied on ‘new’
media to disseminate their messages. But as we will discuss, those spaces are sometimes
becoming co-opted and transformed into sites of disinformation. Since this article is
about disinformation at the intersection of mass and social media, a brief history of how
new modes of propaganda have been informed by old mechanisms of media control is
necessary.

Broadcast media after the fall of the Soviet Union


The recent history of mainstream media in Russia can be characterized by three distinct
moments: a period of complete state control during Soviet times, a period of relative
openness following the collapse of the Soviet Union and a hybrid model in effect today
where media has been privatized but acts in accordance with the interests of the ruling
party, while a small marginalized independent media is allowed to exist (Etling et al.,
2010). In essence, the approach of the Kremlin has been to move away from direct pres-
sure and intimidation of journalists to a policy of redistribution of media assets to busi-
ness parties that are certain to support the agenda of the government (Lipman, 2014).
Consider the case of the television channel NTV, founded in 1993. NTV became an
important voice of opposition in the post-Soviet period by discussing alternative political
views and openly criticizing President Vladimir Putin. In 2001, however, the channel
was acquired by state-owned Gazprom Media, now the largest media-holding conglom-
erate in Russia. The political objectives of the takeover were quite clear from the begin-
ning, and the dirty tactics employed included a defamation campaign against owner
Vladimir Gusinsky. However, the takeover provoked only small protests, and a subsequent national survey found that only 4% of respondents saw it as an encroachment on
media freedom (Lipman, 2014: 181). In a few years, the channel had lost its political
edge to become a recreational medium specializing in celebrity scandals and trashy
police dramas. The protests of Winter 2011–2012, which called into question the re-
election of Putin, seemed to provide a brief opportunity for a return to critical journalism.
While NTV continued to appease the regime by broadcasting documentaries such as
‘Anatomy of a Protest 2’, which depicted the protesters as being paid by the US
Department of State, it also allowed critical voices to be heard. Within the space of a
couple of months, however, the top brass of NTV was taken to task and some of the staff
forced to resign. The story as a whole is representative of the overall trend: popular inde-
pendent outlets are acquired by state-owned or state-friendly forces and turned into loud-
speakers for the dominant ideology, supplanting any serious discussion of sociopolitical
issues with derivative forms of entertainment.
Media control in Russia just before the conflict with Ukraine had also extended to the Internet, which was seen as less important than broadcast and print media,
but which had emerged as an area of concern particularly after the Arab Spring. While a
more extensive discussion of the Internet will follow, for now it should be pointed out
that redistribution of ownership in that case was supplemented with other strategies such
as overt censorship through technological means as well as an emphasis on legislative
prohibition. An example of the former is the series of Distributed Denial of Service (DDoS) attacks aimed at shutting down the popular and politically vocal blogging platform LiveJournal on 3 December 2011, the day before parliamentary elections (Soldatov and
Borogan, 2015). On the legislative front, the example of the State Duma passing a law in
2012 meant to protect children from information deemed to be harmful to their health
and development is indicative of this trend. Implementation of the law was facilitated by
a blacklist of supposedly dangerous websites. One of the criteria used to block websites
on the list has been, since the summer of 2013, representations of so-called non-
traditional sexual relationships that can be viewed by minors (Grove, 2013).

Media and the Orange Revolution in Ukraine


The politics of suppressing the opposition through media control have been less direct
in Ukraine, where what it means to be the opposition has frequently been a complex
notion. Changes in the power structure since independence (declared in 1991) mean that Ukrainian opposition forces have oftentimes switched places with those in power, and these shifts are often accompanied by a reorganization of the media landscape. The example of Channel 5 serves to illustrate this trend well. The channel was
formed during the Orange Revolution, the protest movement started during the Winter
of 2004–2005 to challenge the results of the presidential election in which Viktor
Yanukovych (the pro-Russia candidate, generally speaking) beat his opponent, Viktor
Yushchenko (the pro-EU candidate). Yushchenko supporters claimed the election pro-
cess was marred by fraud and corruption and called for an unprecedented (some would
say illegitimate) third round of voting. A judicial procedure at the Supreme Court was
launched to legitimize the revote (Yushchenko would come to occupy the presidency,
then be replaced by Yanukovych, who would later be removed from office). Just before
the Orange Revolution, Channel 5 was formed as the first major television news chan-
nel in Ukraine. Coverage in Channel 5 reflected a critical stance against Yanukovych
and was instrumental in creating support for the third round of voting in the elections that eventually led to the Yushchenko victory. These days, however, Channel 5 is
owned by Petro Poroshenko, President of Ukraine since 2014, who has refused to part
with it even while serving as head of the country.
Media ownership by oligarchs (wealthy and influential businessmen) with strong
political agendas has not been conducive to freedom of speech. Under Yanukovych, the Press Freedom Index downgraded Ukraine from 89th to 126th out of 179 nations
(Leshchenko, 2014: 53). But while traditional broadcast media remains the most effec-
tive means of controlling public discourse (47% of residents in southern Ukraine watch
TV, and 44% in the east, according to Leshchenko, 2014: 56), the incursion of digital
media has disrupted media control dynamics. The landscape changed quickly and sig-
nificantly: from 2000 to 2001, Internet use in Ukraine jumped by 30%, and from 2000 to
2004 mobile phone use went from 2% to more than 29% (Lysenko and Desouza, 2014:
185). Given the ownership patterns of traditional media, digital media emerged as
practically the only alternative source of news, and the public flocked to these sources.
During the Euromaidan protests, for example, the number of unique visitors to the inde-
pendent online newspaper Ukrayinska Pravda went from 300,000 to 1 million per day
(Leshchenko, 2014: 55). These new platforms and the opportunities they afforded took
the authorities of Ukraine and Russia, which had been content with monitoring and con-
trolling traditional mass media (Lysenko and Desouza, 2014: 185), by surprise. However,
the shock was not long lasting, and the regimes quickly started to experiment with ways
of reasserting control.

New modes of disinformation


The way media channels remain relevant is by crafting messages that fit the expectations
and ideologies of their target audiences. During times of conflict, this means keeping
alive and capitalizing on the divisions within societies and between nations. In cases
such as the Russia–Ukraine conflict, ‘lack of trust is widespread and on occasion skill-
fully manipulated by the authorities’ (Etling et al., 2010: 7). What is new, however, is that
the rhetoric of division and ‘us versus them’ does not just flow from the media monopoly
to the audience, sponsored by state or private interests. With the advent of social media,
online discourse has become an important space for the generation and propagation of
these messages, turning regular citizens into propaganda machines capable of spreading
disinformation, paranoia and hatred. Thus, in the aftermath of the Arab Spring, it is pos-
sible to see the Russia–Ukraine conflict as marking an important reversal in how the
Internet is used during conflicts. The subversive possibilities hinted at in the narratives
of the Twitter Revolutions are giving way to mistrust and cynicism, as networks are co-
opted by state and corporate interests. The point is not that the Internet plays no role in
exposing wrongdoing and raising awareness, but that in spite of that, authorities have
figured out how to use it to their advantage, incorporating it into their arsenal of disinfor-
mation weapons.

Russia: the Twitter Revolution that never was


Russia’s efforts to control the Internet encompass a variety of strategies, technological as
well as non-technological. Since the 1990s, security agencies have deployed devices known as
SORM (‘System for Operative Investigative Activities’) to ‘collect, analyze and store all
data that [is] transmitted or received on Russian networks, including calls, email, website
visits and credit card transactions’ (Lewis, 2014: para. 2; Soldatov and Borogan, 2015).
Network providers are required to install and pay for these devices, and although a court
order is necessary to collect data, providers cannot see the content of the order or know
what is being collected (regardless of logistical and legal particularities, this is not that
different from what the National Security Agency can do in the United States; see Van
der Velden, 2015). The Internet is also controlled through non-technological strategies
such as political pressure exerted on companies that do not toe the Kremlin line; armies
of paid bloggers and trolls that post content and opinions favorable to the regime (more
on this later); and through a legal framework that extends surveillance and filtering at
times of political upheaval (Soldatov and Borogan, 2015).

Undoubtedly, the Internet has played a role in creating a space for dissent in Russia.
But the awakening of the dissenters to the fact that – as Morozov (2012) argues – social
media can be used by both sides, not just the side one likes, was abrupt. As social media
became more popular, a new breed of disinformation campaigns emerged that effort-
lessly moved between old and new media, often referencing each other. For instance, in July 2014, Channel One aired a story featuring a woman supposedly from Slavyansk
who decried the crucifixion of a 3-year-old boy. Ukrainian soldiers, she claimed, had
killed him in order to frighten the population of the city. No investigation corroborated
the information, and the news cycle quickly moved on. What was noteworthy, however,
was that the story was first published on Facebook by pro-Kremlin ideologue Alexander
Dugin (Danilova, 2014), and Channel One picked it up and produced its scandalous
report a few days later. This illustrates how mass media can report inaccurate information found on social media, giving fake news an aura of legitimacy.
Frequently, the content for these disinformation campaigns is generated using appro-
priated sources. According to the Ukrainian site Stopfake.org, there are numerous cases
of attempts to legitimize disinformation through the use of photos or video taken from
another context. For instance, a 2013 photo of a war victim in Syria was used in May
2014 to serve as proof that Ukrainian soldiers had wounded a 12-year-old schoolboy in
Sloviansk, and later, as proof that the wounded boy was from Donetsk (Stopfake.org,
2014c). In the same month, a photo allegedly from Donbass depicting a crying girl sitting near what was reported to be her murdered mother circulated widely on VKontakte (a Russian social networking site launched in 2006) and Twitter. In actuality, the photo was a still
from a 2010 film co-produced by Russia and Belarus titled The Brest Fortress (Stopfake.org, 2014d). Earlier, in March 2014, users of VKontakte and other social media actively
reposted a photo from Lviv where, according to one description, ‘some bastards were
trampling a granny’ who was going to lay flowers at the monument of Lenin. Stopfake.org pointed out that, first, there has not been a Lenin monument in Lviv for more than
20 years and, second, that the ‘granny’ was actually a man participating (together with
the ‘bastards’) in a theatrical performance during a march against illegal immigration in
2009 (Stopfake.org, 2014b). Sometimes, the disinformation strategies rely not just on
fake materials, but on fake eyewitnesses. In May 2014, 46 pro-Russian demonstrators
died in a fire at the Trade Unions House in Odessa. An emergency doctor from the same
city, Igor Rozovskiy, reported on Facebook that Ukrainian nationalists didn’t allow him
to help the injured. This story was reposted on Facebook more than two thousand times
and was translated into several languages. In actuality, Rozovskiy’s Facebook account
was created just before the story was published, and his profile picture was that of a
dentist in Russia. The user account was later deleted (Stopfake.org, 2014a).

Troll factories
Under these circumstances, it is difficult to separate fact from fiction, and nearly impos-
sible to ascertain who is behind a particular disinformation campaign. Even the com-
ments posted in response to fake news articles might themselves be fake. The existence
of Russian armies of paid pro-government Internet trolls is roundly denied, but thor-
oughly documented (an Internet troll is a person who posts incendiary comments and
expresses disagreement through insults). The British newspaper The Guardian, for
example, has reported concerted attacks on some of its articles about Russia and Ukraine
of up to 40,000 comments per day (Gregory, 2014). The existence of a ‘troll factory’ situ-
ated in Olgino, outside St. Petersburg, was reported in 2013. Journalists from the Russian
newspapers Novaya Gazeta and Moi Raion even infiltrated it, posing as job seekers.
They found that the factory was referred to as the Internet Research Agency and was
supposedly started by Putin’s friend Evgeny Prigozhin. There, hundreds of paid bloggers
worked hard every day under fake identities, apparently without an employment con-
tract. Their job was to praise Putin and denounce the opposition in forums, social net-
works and the comment boards of national and international media (Garmazhapova,
2013). The conflict with Ukraine has obviously been a major battlefront for these so-
called trolls. Lyudmila Savchuk, an ex-worker at the troll factory who now runs a community on VKontakte denouncing Kremlin propaganda, told The New York Times that she
and her co-workers were encouraged to ‘post comments that disparaged the Ukrainian
President, Petro Poroshenko, and highlighted Ukrainian Army atrocities’ (Chen, 2015:
para. 12). According to The Guardian, ‘the trolls were firmly instructed that there should
never be anything bad written about the self-proclaimed Donetsk People’s Republic
(DNR) or the Luhansk People’s Republic (LNR), and never anything good about the Ukrainian government’ (Walker, 2015: para. 20). In spite of this, the work of trolls is
generally easy to identify, mostly because of its repetitiveness. They often inundate
Twitter and other social media platforms posting the same message again and again from
many different fake accounts. Alexander (2015a) measured the scale of the network of
Russian trolls on Twitter, discovering that it consists of 2900 interconnected accounts. In
a separate study (Alexander, 2015b), he tied a whole network of anonymous websites to
the activities of the troll factory; the network included sites producing pro-Russian
memes, demotivator graphics ridiculing opponents and portraying Putin as a strong
leader, and blogs from supposedly disillusioned Euromaidan activists.
The ‘Trolls from Olgino’ (as they are known) remain an effective tool of Kremlin
disinformation, but the Internet has given authorities other ways of putting pressure on
the opposition. While it is true that social media has provided an efficient way for activ-
ists to rally supporters, it has also made it easy for authorities to identify and intimidate
dissenters. The website predatel.net has been publicly identifying and condemning oppo-
nents of the regime and making it possible for visitors to click a button to ‘suggest a
traitor’ (Dougherty, 2014). But these measures cannot stop the spread of information
completely. For instance, in late August 2014, when the Kremlin was denying the partici-
pation of Russian troops in the war in Ukraine, journalists from the regional newspaper
Pskovskaya Guberniya discovered the VKontakte pages of Russian soldiers who had died in Ukraine, and even located the graves of a couple of them (Standish, 2014).
The newspaper’s web site was quickly brought down by hackers, although the articles
could still be found on the blog of the newspaper’s publisher, Lev Shlosberg. Later, when
Pskovskaya Guberniya published recorded conversations of Russian paratroopers dis-
cussing their losses, Shlosberg’s blog was attacked as well. By then, however, the infor-
mation was spreading through reposts. It is this ever-present threat of an Internet-assisted
protest movement close to home that must have motivated politicians and functionaries
of the Federal Service for Supervision of Communications, Information Technology and Mass Media (known as Roskomnadzor) to consider a general prohibition of Facebook and Twitter (Beard, 2014). Meanwhile, President Putin floated the idea of creating a
Russian Internet isolated from the rest of the World Wide Web (Harding, 2014). The
political backlash that such drastic measures would unleash would perhaps be too costly; instead, the Kremlin seems to prefer co-opting social media rather than trying to eliminate it. When dissent is allowed to exist online, it is because of a calculated
move on the part of the state, which sees the benefit of keeping some valves open for the opposition to blow off steam under careful surveillance (Dougherty, 2014). As Etling
et al. (2010) observe, in Russia ‘anyone with access to the Internet can criticize the gov-
ernment and other powerful interests, but there are often consequences’ (p. 4).

Ukraine: the Twitter Revolution that worked too well


Social media played a decisive role in Ukraine’s Euromaidan movement, which followed Yanukovych’s refusal to ratify the trade agreement with the European Union in
2013. Yes, the revolution was, to a certain extent, Tweeted. But the success with which
one side of the conflict was able to present their views also created a distorted perception
of public support. Merely a decade earlier, the first Maidan had been primarily a TV
phenomenon, complete with televised concerts featuring rock bands. In comparison, the
movement of 2013 was a different kind of protest in which digital networks shaped pub-
lic opinion much more effectively than traditional broadcast media. In fact, the role of
major TV channels was taken over by small independent companies, equipped with
modern digital equipment and broadcasting only through streaming services, which
meant they didn’t have to deal with licenses or government regulations that could attempt
to censor their content. Even if the government had had an appropriate set of anti-libel
and anti-extremist regulations for these independent outlets, it would have been imprac-
tical to try to apply them to everyone. However, it was mainly the views of one side, the
protesters, that were voiced on these new channels.
If earlier movements had taken place in the mass media as a debate between two
sides, the early days of Euromaidan unfolded in a digital public sphere as a one-sided
argument largely devoid of opposing voices. Social media drastically changed the politi-
cal landscape by undermining the side least prepared to fight a new kind of information
war, eventually forcing them out of power. But what was the aftermath of this social
media revolution? After Yanukovych was removed and the dust settled a bit, the fact that the vast majority of the Ukrainian population was not on Facebook or VKontakte came back to haunt the former protesters, now in power. The sense of unanimity and popular support the opposition believed they enjoyed turned out to be, to a certain
degree, an illusion created not only by a subset of the population with access to the
Internet, but in some cases by well-orchestrated spam bot campaigns (Korrespondent.net, 2012). The certainty with which the dissenters had claimed to represent the majority
of Ukrainians began to crumble (aided, of course, by counter-insurgency measures
funded by Russia). The effects of the filter bubble (Pariser, 2012) had been quite drastic.
The first ones to experience the pressure were the journalists who tried to express an
alternative opinion. Many of them, especially those who worked for mainstream online
media, were forced to stop reporting or change their views in favor of what was
perceived to be public opinion. If they refused to conform, they were labeled traitors to
the cause and inevitably lost readers and in some cases their jobs. This was not seen as
problematic because, after all, it was the public and not the state that was rejecting alter-
native voices.
This illustrates a salient paradox in Ukraine: while the post-Soviet state has bor-
rowed many of the surveillance and counter-insurgency strategies of Russia – including
the use of SORM devices – a certain degree of freedom of expression can be found,
especially online. However, as Miazhevich suggests, this freedom has a dark undercur-
rent. As in Russia, it can serve to render the existence of the opponent more tolerable
and manageable, since online speech can be countered through disinformation cam-
paigns (Miazhevich, 2015: 431). More significantly, ‘the maximum flexibility of dis-
course enabled by new media works against consolidation of civic society as it prompts
its fragmentation and virtualization’ (Miazhevich, 2015: 434, emphasis in original).
Yes, social media had allowed anyone with access to the Internet to take part in the
movement and become a virtual activist, regardless of age, gender or location. But the
perception that the majority of Ukrainian citizens supported the new government and its
pro-European agenda turned out to be a tragic miscalculation, one that many social
media users are still trying to comprehend. The gap between representation and reality,
between the virtual ideals of an avant-garde and the ideology of the unwired masses,
helped to catapult the country into civil war.

Disinformation, power and profit


In a speech in April 2014, President Putin referred to the Internet as a CIA invention.
In a post-Snowden era, his comments might resonate with a larger audience. The
Internet might not have been originally intended as a tool for civilian surveillance, but
today social media certainly makes collecting intelligence and monitoring dissent a
lot easier, regardless of whether it is the United States’ National Security Agency or
Russia’s Federal Security Service doing the collecting. From the citizen’s perspective,
increased surveillance seems unavoidable – and for younger generations, perhaps
unproblematic. For example, many social media users in Ukraine are still using Russian-
owned VKontakte – 62% of those surveyed, or about twice the user base of Facebook
(Leshchenko, 2014: 55; Lokot, 2014) – even after it emerged that authorities were
monitoring and blocking pro-Ukraine social media accounts (Berkman, 2014).
The main question behind our argument is how to align theories of the Internet as an
agent of social change with theories of the Internet as an obstacle to social change, and
in light of examples from the Russia–Ukraine conflict, use these parallel explanations
to anticipate how similar scenarios might develop elsewhere, including in
democracies. To begin with, it should be acknowledged that the
potential of the Internet to act as an agent of social change is tied to specific historical
conditions. As Lysenko and Desouza (2014) demonstrate in their examination of cyber-
protest and counteraction in post-Soviet states during 1997–2011, if the conditions
include new media platforms that authorities are not monitoring and do not know how to
control, and if there are new players in these platforms that provide alternative informa-
tion and organizing tools (newspapers, bloggers, activist groups, etc.), then the Internet
is likely to play a role in social change. However, the corollary is that once the govern-
ment starts watching the new platforms and using them in counter-insurgency strategies
(including distributed disinformation campaigns), the impact of the new technologies is
reduced. In other words, the effectiveness of the Internet as an agent of change is more
pronounced when regimes are inexperienced in controlling it, but diminishes when the
regime develops strategies to surveil and manipulate it.
During that period in which tools are still new and can be applied in innovative ways
without restrictions, dissident groups are relatively successful in harnessing their power
to influence and mobilize multitudes. But as trends of media conglomeration, privatiza-
tion and deregulation continue – not just in non-democratic regimes, but in democratic
ones as well – it might soon be as outrageous to suggest that the Internet can bring about
social change as to suggest that mainstream television or radio might do so. Whatever
digital romanticism remains needs to be critically reassessed, and like those other media,
the Internet should be regarded as potentially another weapon of mass deception, allow-
ing different political actors to wield it in order to distort reality and encouraging social
media users to repost lies and hate-speech to gain a few more ‘likes’.
This transformation of the Internet is being achieved in post-Soviet states – like eve-
rywhere else – through a combination of three kinds of controls: regulatory, economic
and technological. First, regarding regulatory approaches, consider the work that Russia's
general prosecutor and his deputies have been doing since February 2014
through Roskomnadzor (the agency that regulates telecommunications) to block any
website containing dangerous content or calling for public demonstrations, all without a
court decision. In March of that year, Russian authorities blocked three opposition web-
sites – Grani.ru, Kasparov.ru and Daily Journal – as well as the blog of influential pro-
democracy anti-corruption activist Alexei Navalny (Barry, 2014). By September,
Roskomnadzor had blocked almost 2500 websites; 600 of them, the head of the agency
announced, contained ‘extremist’ content or called for unauthorized public gatherings
(Kozyrev, 2014). Other laws, meanwhile, have instituted a prohibition of swear words
that can be used to censor content (BBC News, 2014b), enforced selective anti-piracy
regulations that can shut down domains (Kozlov, 2013) and categorized bloggers with
more than 3000 readers per day as mass media channels, placing on them cumbersome
fact-checking obligations (BBC News, 2014a). Second, regarding economic means of
media control, there is a definite shift in Russia toward a form of state
monopoly capitalism, in which the government intervenes to form and protect certain
monopolies and block competition. The largest media assets in the country are controlled
by the state or by oligarchs who are loyal to the Kremlin. A recent bill limiting foreign
ownership, control or operation of media channels to 20% consolidates this status quo; it
is basically impossible to run a media organization if one is not on good terms with the
Kremlin. In January of 2014, for instance, VKontakte went through a severe re-organization.
Founder Pavel Durov was dismissed as CEO and forced to sell his shares of the company.
Durov had refused to block the page of Alexei Navalny and to hand over Ukrainian protest-
ers’ data to the Federal Security Service. He eventually fled the country, and VKontakte is
now in the hands of owners friendly to the regime. Finally, the third means of control is
technical. What is interesting is that apart from state initiatives like SORM (discussed
above), there are technical approaches that demonstrate direct or indirect cooperation
with the private sector in Russia, as well as in the west. For example, in 2014, Russian
search engine company Yandex began showing different online maps of Crimea, one
showing the peninsula as part of Ukraine, the other one as part of Russia. The idea was
to give the corresponding set of users from each nation a view of reality that matches the
perspective of their respective governments, so that the company could remain on the
good side of all parties, and thus continue to be profitable (Soldatov and Borogan, 2015:
303). But some western corporations are also participants in these tactics. For example,
Boston-based Crimson Hexagon, whose social media analytics tools are used by aca-
demics in the west to study public discourse and mobilization in Russia and Ukraine
(Etling et al., 2010), has also worked with intelligence agencies in Russia to help them
use the same tools to monitor citizens and activists (Soldatov and Borogan, 2015: 282).
One more example of technological means of control serves to illustrate potential appli-
cations beyond the Slavic region. At one point in January 2014, protesters who had con-
gregated around the Maidan in Kiev simultaneously received a text message on their
phones that read: ‘Dear subscriber, you are registered as a participant in a mass distur-
bance’ (Soldatov and Borogan, 2015: 278). Phone companies denied any involvement,
and this blunt counter-insurgency strategy, roundly ridiculed by citizens, only served to
embolden the protesters even further. The irony is that in a democratic country citizens
might actually be more intimidated by receiving similar messages, simply because
perceived freedom and security might make them more averse to risk.
These comparisons between post-Soviet and western contexts, while somewhat spec-
ulative, are necessary. It is easy to critique the disinformation approaches discussed
throughout this essay as examples of authoritarian attempts to control new media plat-
forms, but it is not that difficult to point to parallel tactics employed by democratic
regimes. To be sure, there is a unique set of conditions in Russia that differentiate this
case from the rest: weak rule of law, no independent judiciary, no freedom of speech or
human rights protections, telecommunications regulation in the interest of the powerful
and a propensity to silence political opposition (MacKinnon, 2012: 90). But for each of
the media control strategies in the post-Soviet context, there is an analogue or mirror
image in the western world. Both approaches share characteristics such as deregulation
of industry in a manner that gives more market power to favored corporations; increased
state power to impose special measures of surveillance during increasingly permanent
periods of emergency; a discourse of patriotism which shames dissenters and encourages
self-censorship; collaboration between government and private sector to develop and
implement technologies for surveillance; and increased secrecy about what governments
and corporations do with data collected from citizens, all in the name of security and
anti-terrorism. The media strategies that democratic states employ to surveil citizens –
strategies which frequently replicate those of non-democratic regimes – have been docu-
mented (Howard and Hussain, 2013). Furthermore, we know that surveillance is a
profitable emerging global industry, with democracies and non-democracies alike spend-
ing US$178 billion in 2010, and a projected US$2.7 trillion over the next decade (Hayes,
2012). A 2010 report indicated that in the United States alone there were 1931 private
firms doing classified work for 1271 government organizations (Hayes, 2012). Other
investigative reports suggest that agencies such as the FBI have possibly committed ‘tens
of thousands’ of legal violations while monitoring citizens (EFF, 2011).
In the end, the political differences between democratic and authoritarian regimes
might not matter as much as the fact that both participate in a media surveillance
industry rife with abuse. This is blatantly evident in the fact that, regardless of the
outcome of a protest movement, the multinational companies that own the platforms used
for protest stand to profit. For instance, after a year of conflict, and encouraged by the
‘tremendous growth’ in the region, Twitter announced that it was ready to start offering
advertising services in Ukraine. ‘As part of this expansion, Ukrainian brands and adver-
tisers will gain access to Twitter’s direct sales support teams and seller partnerships’
(Booton, 2014: para. 1). As Philip Mirowski (2013) observes when speaking about pro-
test movements that rely on corporate social media to conduct their organizing: ‘You get
to express yourself; they get to make money’ (p. 331).
Of course, corporate ownership of media channels and the pressures that advertising
places on content were two key characteristics of the traditional propaganda model
developed by Herman and Chomsky (1998). Although the model has been critiqued and
updated (Sparks, 2007), social media seems to be fundamentally transforming the rela-
tionship between states, media organizations, and audiences when it comes to propa-
ganda. Disinformation seem to be less the result of message manipulation by elite media
owners, and more of a byproduct of harvesting (via social media) and directly reporting
(to the detriment of the job of the journalist) the opinions of ‘the people’. In this way,
states can rely on citizens’ do-it-yourself disinformation campaigns to maintain the status
quo. Worryingly, these media practices are not just a feature of autocratic regimes, but an
emerging characteristic in democracies as well.

Funding
The author(s) received no financial support for the research, authorship, and/or publication of this
article.

References
Alexander L (2015a) Open-source information reveals Pro-Kremlin web campaign. Global
Voices. Available at: https://globalvoices.org/2015/07/13/open-source-information-reveals-
pro-kremlin-web-campaign/ (accessed 19 February 2016).
Alexander L (2015b) Social network analysis reveals full scale of Kremlin’s Twitter bot campaign.
Global Voices. Available at: https://globalvoices.org/2015/04/02/analyzing-kremlin-twitter-
bots/ (accessed 19 February 2016).
Barry E (2014) Russia blocks web content amid tension over Ukraine. The New York Times, 13
March. Available at: http://www.nytimes.com/2014/03/14/world/europe/russia-blocks-web-
content-amid-tension-over-ukraine.html (accessed 14 April 2015).
BBC News (2014a) Russia enacts ‘draconian’ law for bloggers and online media. BBC News,
1 August. Available at: http://www.bbc.com/news/technology-28583669 (accessed 14 April
2015).
BBC News (2014b) Russian law bans swearing in arts and media. BBC News, 5th May. Available
at: http://www.bbc.com/news/world-europe-27286742 (accessed 14 April 2015).
Beard N (2014) Facebook and Twitter ‘ready for complete block’ after refusing further censor-
ship, Rain TV reports. The Calvert Journal. Available at: http://calvertjournal.com/news/
show/3482/facebook-and-twitter-ready-for-complete-block-after-refusing-further-censor
(accessed 14 April 2015).
Berkman F (2014) Russia blocks Pro-Ukraine groups on social media. Mashable. Available at:
http://mashable.com/2014/03/03/russia-ukraine-internet/ (accessed 14 April 2015).
Booton J (2014) Twitter to offer advertising services in Ukraine. MarketWatch. Available at: http://
www.marketwatch.com/story/twitter-to-offer-advertising-services-in-ukraine-2014-08-26
(accessed 14 April 2015).
Chen A (2015) The agency. The New York Times, 2 June. Available at: http://www.nytimes.
com/2015/06/07/magazine/the-agency.html (accessed 19 February 2016).
Danilova M (2014) Truth and the Russian media. Columbia Journalism Review. Available at: http://
www.cjr.org/behind_the_news/truth_and_russian_media.php (accessed 14 April 2015).
Deleuze G (1997) Negotiations 1972–1990. New York: Columbia University Press.
Diamond L (2012) Liberation technology. In: Diamond L and Plattner MF (eds) Liberation
Technology: Social Media and the Struggle for Democracy. Baltimore, MD: Johns Hopkins
University Press, pp. 3–17.
Dougherty J (2014) Everyone Lies: The Ukraine Conflict and Russia’s Media Transformation.
Cambridge, MA: Shorenstein Center on Media, Politics and Public Policy, John F. Kennedy
School of Government, Harvard University. Available at: http://shorensteincenter.org/every-
one-lies-ukraine-conflict-russias-media-transformation/ (accessed 14 April 2015).
EFF (2011) Patterns of misconduct: FBI intelligence violations from 2001-2008. Electronic
Frontier Foundation. Available at: https://www.eff.org/wp/patterns-misconduct-fbi-intelli-
gence-violations (accessed 7 June 2016).
Etling B, Faris R, Palfrey J, et al. (2010) Public Discourse in the Russian Blogosphere: Mapping
RuNet Politics and Mobilization. Cambridge, MA: Berkman Center for Internet & Society
at Harvard University. Available at: https://cyber.law.harvard.edu/publications/2010/Public_
Discourse_Russian_Blogosphere (accessed 9 October 2015).
Garmazhapova A (2013) Где живут тролли. И кто их кормит [Where the trolls live. And who
feeds them.]. Novaya Gazeta. Available at: http://www.novayagazeta.ru/politics/59889.html
(accessed 19 February 2016).
Gray J (2015) Under western eyes. Harper’s Magazine, January, pp. 11–16.
Gregory PR (2014) Putin’s new weapon in the Ukraine propaganda war: internet trolls. Forbes.
Available at: http://www.forbes.com/sites/paulroderickgregory/2014/12/09/putins-new-weapon-
in-the-ukraine-propaganda-war-internet-trolls/ (accessed 14 April 2015).
Grove T (2013) Russia passes anti-gay law, activists detained. Reuters, 11 June. Available at:
http://www.reuters.com/article/us-russia-gay-idUSBRE95A0GE20130611 (accessed 1 March
2016).
Harding L (2014) Putin considers plan to unplug Russia from the internet 'in an emergency'. The
Guardian, 19 September. Available at: http://www.theguardian.com/world/2014/sep/19/
vladimir-putin-plan-unplug-russia-internet-emergency-kremlin-moscow (accessed 14 April
2015).
Hayes B (2012) The surveillance-industrial complex. In: Ball K, Haggerty K and Lyon D (eds)
Routledge Handbook of Surveillance Studies. New York: Routledge, pp. 167–175.
Herman ES and Chomsky N (1998) Manufacturing Consent: The Political Economy of the Mass
Media. New York: Pantheon.
Howard P (2010) The Digital Origins of Dictatorship and Democracy: Information Technology
and Political Islam, 1st edn. New York: Oxford University Press.
Howard PN and Hussain MM (2013) Democracy’s Fourth Wave?: Digital Media and the Arab
Spring. New York: Oxford University Press.
Korrespondent.net (2012) In Ukraine robot commentators manipulated discussions on the
Internet [В Украине проплаченные комментаторы манипулируют дискуссиями в
интернете]. Korrespondent.net, 24 September. Available at: http://korrespondent.net/ukraine/
politics/1398610-v-ukraine-proplachennye-kommentatory-manipuliruyut-diskussiyami-v-
internete-freedom-house (accessed 19 April 2015).
Kozlov V (2013) Russia amends anti-piracy law to specify procedure for blocking illegal con-
tent. The Hollywood Reporter. Available at: http://www.hollywoodreporter.com/news/russia-
amendments-anti-piracy-law-661475 (accessed 14 April 2015).
Kozyrev M (2014) Цифра дня: сколько экстремистских сайтов заблокировал Роскомнадзор
[Figure of the day: how many extremist websites were blocked by Roskomnadzor]. Apparat.
Available at: http://apparat.cc/news/rcn-vs-extreme/ (accessed 14 April 2015).
Leshchenko S (2014) The media’s role. Journal of Democracy 25(3): 52–57.
Lewis JA (2014) Reference Note on Russian Communications Surveillance. Washington, DC:
Center for Strategic and International Studies. Available at: http://csis.org/publication/refer-
ence-note-russian-communications-surveillance (accessed 22 October 2015).
Lipman M (2014) Russia’s nongovernmental media under assault. Demokratizatsiya 22(2):
179–190.
Lokot T (2014) Russian social networks dominate in Ukraine despite information war. Global
Voices. Available at: http://globalvoicesonline.org/2014/09/01/ukraine-russia-social-net-
works-information-war/ (accessed 14 April 2015).
Lysenko VV and Desouza KC (2014) Charting the coevolution of cyberprotest and counteraction:
the case of former Soviet Union states from 1997 to 2011. Convergence: The Journal of
Research into New Media Technologies 20(2): 176–200.
McChesney RW (2014) Digital Disconnect. New York: The New Press.
MacKinnon R (2012) China’s ‘networked authoritarianism’. In: Diamond L and Plattner MF (eds)
Liberation Technology: Social Media and the Struggle for Democracy. Baltimore, MD: Johns
Hopkins University Press, pp. 78–92.
Mejias UA (2013) Off the Network: Disrupting the Digital World. Minneapolis, MN: University
of Minnesota Press.
Miazhevich G (2015) Sites of subversion: online political satire in two post-Soviet states. Media,
Culture & Society 37(3): 422–439.
Mirowski P (2013) Never Let a Serious Crisis Go to Waste: How Neoliberalism Survived the
Financial Meltdown, 1st edn. London: Verso Books.
Morozov E (2012) The Net Delusion: The Dark Side of Internet Freedom. New York:
PublicAffairs.
Pariser E (2012) The Filter Bubble: How the New Personalized Web Is Changing What We Read
and How We Think. New York: Penguin Books.
Shevtsova L (2014) The Russia factor. Journal of Democracy 25(3): 74–82.
Soldatov A and Borogan I (2015) The Red Web: The Struggle between Russia’s Digital Dictators
and the New Online Revolutionaries. New York: PublicAffairs.
Sparks C (2007) Extending and refining the propaganda model. Westminster Papers in
Communication and Culture 4(2): 68–84.
Standish R (2014) Is Vladimir Putin covering up the deaths of Russian soldiers in Ukraine? Foreign
Policy. Available at: http://foreignpolicy.com/2014/09/01/is-vladimir-putin-covering-up-the-
deaths-of-russian-soldiers-in-ukraine/ (accessed 14 April 2015).
Stopfake.org (2014a) Fake: nationalists prevented paramedic from saving a wounded. Available at:
http://www.stopfake.org/en/fake-nationalists-prevented-paramedic-from-saving-a-wounded/
(accessed 19 February 2016).
Stopfake.org (2014b) FAKE: neo-fascists in Lviv were beating up an elderly lady, who was going
to put flowers on Lenin’s monument. Available at: http://www.stopfake.org/en/fake-neo-
fascists-in-lviv-were-beating-up-an-elderly-lady-who-was-going-to-put-flowers-on-lenin-s-
monument/ (accessed 19 February 2016).
Stopfake.org (2014c) Fake photos appeared on the Internet, of children who were supposedly
killed in Eastern Ukraine. Available at: http://www.stopfake.org/en/fake-photos-appeared-
on-the-internet-of-children-who-were-supposedly-killed-in-eastern-ukraine/ (accessed 19
February 2016).
Stopfake.org (2014d) Snapshot of movie The Brest Fortress is being presented as a photo of
Donbass. Available at: http://www.stopfake.org/en/snapshot-of-movie-the-brest-fortress-is-
being-presented-as-a-photo-of-donbass/ (accessed 19 February 2016).
Van der Velden L (2015) Leaky apps and data shots: technologies of leakage and insertion in
NSA-surveillance. Surveillance & Society 13(2): 182–196.
Vandiver J (2014) SACEUR: allies must prepare for Russia ‘hybrid war’. Stars and Stripes.
Available at: http://www.stripes.com/news/saceur-allies-must-prepare-for-russia-hybrid-
war-1.301464 (accessed 9 October 2015).
Walker S (2015) Salutin’ Putin: inside a Russian troll house. The Guardian, 2 April. Available
at: http://www.theguardian.com/world/2015/apr/02/putin-kremlin-inside-russian-troll-house
(accessed 19 February 2016).
Wu T (2010) The Master Switch: The Rise and Fall of Information Empires. New York: Alfred
A. Knopf.
