Media, Culture & Society, 2016
DOI: 10.1177/0163443716686672
Original Article
Ulises A Mejias
State University of New York at Oswego, USA
Nikolai E Vokuev
Syktyvkar State University, Russia
Abstract
The ongoing conflict between Russia and Ukraine can be analyzed as an instance where
the Internet has strengthened the power of political actors to create disinformation. But
it is no longer only the state-supported media monopoly that produces and disseminates
propaganda. Citizens themselves actively participate in their own disenfranchisement by
using social media to generate, consume or distribute false information, contributing
to a new order where disinformation acquires increasing authority. This essay follows
disinformation practices through the transition from broadcast to social media in post-
Soviet times and theorizes how the coexistence of old and new media in the production
of propaganda might inform our understanding of future scenarios, including in Western
democracies.
Keywords
disinformation, Internet, propaganda, Russia, social media, Ukraine
Introduction
In the chaotic and complicated aftermath of the ephemerally named Twitter Revolutions
– including the Occupy and Arab Spring protest movements – it seems pertinent to exam-
ine not only how activists benefitted from using social media to mobilize and organize
Corresponding author:
Ulises A Mejias, State University of New York at Oswego, 222B Marano, Oswego, NY 13126, USA.
Email: ulises.mejias@oswego.edu
but also how these tools may have been used to undermine the protests. While arguments
that posit social media as the principal cause of revolutions are now largely dismissed as
simplistic technological determinism, questions about the complex relationship between
social media technologies and political actors on different sides of a conflict are still
worth pursuing. To that effect, this essay examines recent events in the Russia–Ukraine
conflict, in which the pro-Russian governing party in Ukraine was ousted and replaced
by a pro-European party after a period of demonstrations known as Euromaidan, fol-
lowed by the annexation of the Ukrainian territory of Crimea by the Russian Federation
in March 2014. Our main thesis is that during these events, the use of social media gener-
ally weakened the power of civil society by allowing for the rampant spread of disinfor-
mation. While repressive governments and their agents are traditionally seen as the
sources of propaganda, the Russia–Ukraine conflict suggests that social media can also
give ordinary citizens the power to generate false and inaccurate information. That such
social media campaigns can be co-opted and redistributed via mass media channels to
amplify their effect is cause to fear similar applications in other parts of the world. Thus,
the lessons we derive about how authorities in Russia and Ukraine were able to disrupt
protest movements, using the same digital media platforms activists and citizens were
using, may influence our understanding of future conflicts elsewhere, including in osten-
sibly democratic regimes.
Because concrete evidence of the authorship of disinformation is difficult to obtain,
our approach in this essay is to theorize what happens when propaganda is co-produced
by regimes and citizens and disseminated through a combination of analog and digital
channels including social media. However, we do not seek to dismiss the potential of
digital networks to facilitate protest movements. While it is easy to ridicule the misplaced
faith in the Internet as a magical gravedigger of dictatorship, plenty of evidence
suggests that new information and communication technologies (ICT) have helped pro-
test movements by strengthening civic organizations, lowering the cost of communica-
tion, increasing the speed of mobilization, making fundraising and other forms of support
more effective and so on. Since regimes were not prepared to contend with these effects,
there is concrete evidence that social media have been effective in disrupting the status
quo in the short term (Diamond, 2012; Howard, 2010; Lysenko and Desouza, 2014). It
is, however, our concern with long-term effects, beyond particular moments of protest,
that motivates us to question whether the Internet might actually better serve the interests
of oppression, not democratization. We posit that, as with previous technologies like
radio and television, the Internet is increasingly becoming – after a brief initial moment
of radical possibilities – a conservative form of mass media (McChesney, 2014; Wu,
2010), reducing the political agency of individuals by socially alienating them.
A commander in the US military described Russia’s ongoing disinformation cam-
paign as ‘the most amazing information warfare blitzkrieg we have ever seen in the his-
tory of information warfare’ (Vandiver, 2014: para. 3). Hyperbole aside, it is important to
theorize how new forms of control are being facilitated by media platforms in which citi-
zens actively produce and share disinformation. Our approach consists in using different
types of sources (scholarly, popular and observational) to chronicle the evolution of dis-
information campaigns in the post-Soviet context. This approach allows us to trace
developing modes of disinformation in the transition from broadcast to participatory media in
Russia and Ukraine. A defining feature of these new forms of disinformation is that it
is no longer only the state-controlled media organization that produces propaganda but citizens
themselves who actively participate in the creation of disinformation by using new plat-
forms to push their individual opinions to a point of excess, contributing to a new order
where disinformation acquires a certain authority. Whereas information spread by gov-
ernments or corporations can be skeptically dismissed, information produced and shared
by regular users (or what are perceived to be regular users) acquires authenticity, and
spreading this information is an act rewarded by social media platforms in terms of
increased social capital such as attention, popularity and visibility. In this context, it
might be instructive to recall what Deleuze (1997) observed of societies of control: that
an increase in opportunities for expression (in this case associated with social media)
does not necessarily mean an increase in opportunities for political empowerment:
‘Repressive forces don’t stop people expressing themselves but rather force them to
express themselves …’ (p. 129). Social media encourages this kind of self-expression,
and we see at least two important outcomes of disinformation acquiring authority through
being shared online: first, activists who rely heavily on social media can develop a dis-
torted sense of popular support for their cause that can politically backfire, and second,
private ownership of the social media platforms used during protests can make it easy to
co-opt and weaken social movements (Mejias, 2013; Morozov, 2012). More importantly,
this analysis suggests that civil society can become an active participant in its own dis-
empowerment by engaging in an excess of self-interested communication through the
production and consumption of disinformation. This disembodied compulsion to express
results not in the sharing of meaning, but in its obfuscation. It can lead to mistrust, inac-
tion, nihilism, or violence and seriously threaten public discourse, as the Russia–Ukraine
conflict has exemplified.
The events that followed, which resulted in the annexation of Crimea by the Russian
Federation, launched a new stage of conflict between the two nations.
But as important as the political ramifications of this war are, this essay is narrowly
concerned with media practices. The Russia–Ukraine conflict represents an interesting
juncture at which disinformation practices are replicated and transmuted across ‘old’
(broadcast) and ‘new’ (social) media. On both sides, the opposition has relied on ‘new’
media to disseminate their messages. But as we will discuss, those spaces are sometimes
becoming co-opted and transformed into sites of disinformation. Since this article is
about disinformation at the intersection of mass and social media, a brief history of how
new modes of propaganda have been informed by old mechanisms of media control is
necessary.
A more extensive discussion of the Internet will follow; for now it should be pointed out
that redistribution of ownership in that case was supplemented with other strategies such
as overt censorship through technological means as well as an emphasis on legislative
prohibition. An example of the former are the Distributed Denial of Service (DDoS)
attacks aimed at shutting down the popular and politically vocal blogging platform
LiveJournal on 3 December 2011, the day before parliamentary elections (Soldatov and
Borogan, 2015). On the legislative front, the example of the State Duma passing a law in
2012 meant to protect children from information deemed to be harmful to their health
and development is indicative of this trend. Implementation of the law was facilitated by
a blacklist of supposedly dangerous websites. One of the criteria used to block websites
on the list has been, since the summer of 2013, representations of so-called non-
traditional sexual relationships that can be viewed by minors (Grove, 2013).
Independent online media became practically the only alternative source of news, and the public flocked to these sources.
During the Euromaidan protests, for example, the number of unique visitors to the inde-
pendent online newspaper Ukrayinska Pravda went from 300,000 to 1 million per day
(Leshchenko, 2014: 55). These new platforms and the opportunities they afforded took
the authorities of Ukraine and Russia, which had been content with monitoring and con-
trolling traditional mass media (Lysenko and Desouza, 2014: 185), by surprise. However,
the shock was not long lasting, and the regimes quickly started to experiment with ways
of reasserting control.
Undoubtedly, the Internet has played a role in creating a space for dissent in Russia.
But the awakening of the dissenters to the fact that – as Morozov (2012) argues – social
media can be used by both sides, not just the side one likes, was abrupt. As social media
became more popular, a new breed of disinformation campaigns emerged that effort-
lessly moved between old and new media, often referencing each other. For instance, in
July 2014, Channel One aired a story featuring a woman supposedly from Slavyansk
who decried the crucifixion of a 3-year-old boy. Ukrainian soldiers, she claimed, had
killed him in order to frighten the population of the city. No investigation corroborated
the information, and the news cycle quickly moved on. What was noteworthy, however,
was that the story was first published on Facebook by pro-Kremlin ideologue Alexander
Dugin (Danilova, 2014), and Channel One picked it up and produced its scandalous
report a few days later. This illustrates how mass media can report inaccurate
information found on social media, giving the fake news an aura of legitimacy.
Frequently, the content for these disinformation campaigns is generated using appro-
priated sources. According to the Ukrainian site Stopfake.org, there are numerous cases
of attempts to legitimize disinformation through the use of photos or video taken from
another context. For instance, a 2013 photo of a war victim in Syria was used in May
2014 to serve as proof that Ukrainian soldiers had wounded a 12-year-old schoolboy in
Sloviansk, and later, as proof that the wounded boy was from Donetsk (Stopfake.org,
2014c). In the same month, a photo allegedly from Donbass depicting a crying girl sitting
near what was reported to be her murdered mother was popular on VKontakte (a Russian
social networking site launched in 2006) and Twitter. In actuality, the photo was a still
from a 2010 film co-produced by Russia and Belarus titled The Brest Fortress (Stopfake.
org, 2014d). Earlier, in March 2014, users of VKontakte and other social media actively
reposted a photo from Lviv where, according to one description, ‘some bastards were
trampling a granny’ who was going to lay flowers at the monument of Lenin. Stopfake.
org pointed out that, first, there has not been a Lenin monument in Lviv for more than
20 years and, second, that the ‘granny’ was actually a man participating (together with
the ‘bastards’) in a theatrical performance during a march against illegal immigration in
2009 (Stopfake.org, 2014b). Sometimes, the disinformation strategies rely not just on
fake materials, but on fake eyewitnesses. In May 2014, 46 pro-Russian demonstrators
died in a fire at the Trade Unions House in Odessa. An emergency doctor from the same
city, Igor Rozovskiy, reported on Facebook that Ukrainian nationalists didn’t allow him
to help the injured. This story was reposted on Facebook more than two thousand times
and was translated into several languages. In actuality, Rozovskiy’s Facebook account
was created just before the story was published, and his profile picture was that of a
dentist in Russia. The user account was later deleted (Stopfake.org, 2014a).
Troll factories
Under these circumstances, it is difficult to separate fact from fiction, and nearly impos-
sible to ascertain who is behind a particular disinformation campaign. Even the com-
ments posted in response to fake news articles might themselves be fake. The existence
of Russian armies of paid pro-government Internet trolls is roundly denied, but thor-
oughly documented (an Internet troll is a person who posts incendiary comments and
expresses disagreement through insults). The British newspaper The Guardian, for
example, has reported concerted attacks on some of its articles about Russia and Ukraine
of up to 40,000 comments per day (Gregory, 2014). The existence of a ‘troll factory’ situ-
ated in Olgino, outside St. Petersburg, was reported in 2013. Journalists from the Russian
newspapers Novaya Gazeta and Moi Raion even infiltrated it, posing as job seekers.
They found that the factory was referred to as the Internet Research Agency and was
supposedly started by Putin’s friend Evgeny Prigozhin. There, hundreds of paid bloggers
worked hard every day under fake identities, apparently without an employment con-
tract. Their job was to praise Putin and denounce the opposition in forums, social net-
works and the comment boards of national and international media (Garmazhapova,
(2013). The conflict with Ukraine has obviously been a major battlefront for these
so-called trolls. Lyudmila Savchuk, an ex-worker at the troll factory who now runs a
community in VKontakte denouncing Kremlin propaganda, told The New York Times that she
and her co-workers were encouraged to ‘post comments that disparaged the Ukrainian
President, Petro Poroshenko, and highlighted Ukrainian Army atrocities’ (Chen, 2015:
para. 12). According to The Guardian, ‘the trolls were firmly instructed that there should
never be anything bad written about the self-proclaimed Donetsk People’s Republic
(DNR) or the Luhansk People’s Republic (LNR), and never anything good about the
Ukrainian government’ (Walker, 2015: para. 20). In spite of this, the work of trolls is
generally easy to identify, mostly because of its repetitiveness. They often inundate
Twitter and other social media platforms, posting the same message again and again from
many different fake accounts. Alexander (2015a) measured the scale of the network of
Russian trolls on Twitter, discovering that it consists of 2900 interconnected accounts. In
a separate study (Alexander, 2015b), he tied a whole network of anonymous websites to
the activities of the troll factory; the network included sites producing pro-Russian
memes, demotivator graphics ridiculing opponents and portraying Putin as a strong
leader, and blogs from supposedly disillusioned Euromaidan activists.
The ‘Trolls from Olgino’ (as they are known) remain an effective tool of Kremlin
disinformation, but the Internet has given authorities other ways of putting pressure on
the opposition. While it is true that social media has provided an efficient way for activ-
ists to rally supporters, it has also made it easy for authorities to identify and intimidate
dissenters. The website predatel.net has been publicly identifying and condemning oppo-
nents of the regime and making it possible for visitors to click a button to ‘suggest a
traitor’ (Dougherty, 2014). But these measures cannot stop the spread of information
completely. For instance, in late August 2014, when the Kremlin was denying the partici-
pation of Russian troops in the war in Ukraine, journalists from the regional newspaper
Pskovskaya Guberniya discovered the VKontakte pages of Russian soldiers who had
died in Ukraine, and even located the graves of a couple of them (Standish, 2014).
The newspaper’s website was quickly brought down by hackers, although the articles
could still be found on the blog of the newspaper’s publisher, Lev Shlosberg. Later, when
Pskovskaya Guberniya published recorded conversations of Russian paratroopers dis-
cussing their losses, Shlosberg’s blog was attacked as well. By then, however, the infor-
mation was spreading through reposts. It is this ever-present threat of an Internet-assisted
protest movement close to home that must have motivated politicians and functionaries
of the Federal Service for Supervision of Communications, Information Technology and
Mass Media (Roskomnadzor).
Journalists and bloggers, meanwhile, came under pressure to align with what was
perceived to be public opinion. If they refused to conform, they were labeled traitors to
the cause and inevitably lost readers and in some cases their jobs. This was not seen as
problematic because, after all, it was the public and not the state that was rejecting alter-
native voices.
This illustrates a salient paradox in Ukraine: while the post-Soviet state has bor-
rowed many of the surveillance and counter-insurgency strategies of Russia – including
the use of SORM devices – a certain degree of freedom of expression can be found,
especially online. However, as Miazhevich suggests, this freedom has a dark undercur-
rent. As in Russia, it can serve to render the existence of the opponent more tolerable
and manageable, since online speech can be countered through disinformation cam-
paigns (Miazhevich, 2015: 431). More significantly, ‘the maximum flexibility of dis-
course enabled by new media works against consolidation of civic society as it prompts
its fragmentation and virtualization’ (Miazhevich, 2015: 434, emphasis in original).
Yes, social media had allowed anyone with access to the Internet to take part in the
movement and become a virtual activist, regardless of age, gender or location. But the
perception that the majority of Ukrainian citizens supported the new government and its
pro-European agenda turned out to be a tragic miscalculation, one that many social
media users are still trying to comprehend. The gap between representation and reality,
between the virtual ideals of an avant-garde and the ideology of the unwired masses,
helped to catapult the country into civil war.
A new communication technology, while still unfamiliar to the authorities, is likely to
play a role in social change. However, the corollary is that once the government
starts watching the new platforms and using them in counter-insurgency strategies
(including distributed disinformation campaigns), the impact of the new technologies is
reduced. In other words, the effectiveness of the Internet as an agent of change is more
pronounced when regimes are inexperienced in controlling it, but diminishes when the
regime develops strategies to surveil and manipulate it.
During that period in which tools are still new and can be applied in innovative ways
without restrictions, dissident groups are relatively successful in harnessing their power
to influence and mobilize multitudes. But as trends of media conglomeration, privatiza-
tion and deregulation continue – not just in non-democratic regimes, but in democratic
ones as well – it might soon be as outrageous to suggest that the Internet can bring about
social change as to suggest that mainstream television or radio might do so. Whatever
digital romanticism remains needs to be critically reassessed, and like those other media,
the Internet should be regarded as potentially another weapon of mass deception, allow-
ing different political actors to wield it in order to distort reality and encouraging social
media users to repost lies and hate-speech to gain a few more ‘likes’.
This transformation of the Internet is being achieved in post-Soviet states – like eve-
rywhere else – through a combination of three kinds of controls: regulatory, economic
and technological. First, as far as regulatory approaches are concerned, we can look at the work that
since February 2014 Russia’s general prosecutor and his deputies have been doing
through Roskomnadzor (the agency that regulates telecommunications) to block any
website containing dangerous content or calling for public demonstrations, all without a
court decision. In March of that year, Russian authorities blocked three opposition web-
sites – Grani.ru, Kasparov.ru and Daily Journal – as well as the blog of influential pro-
democracy anti-corruption activist Alexei Navalny (Barry, 2014). By September,
Roskomnadzor had blocked almost 2500 websites; 600 of them, the head of the agency
announced, contained ‘extremist’ content or called for unauthorized public gatherings
(Kozyrev, 2014). Other laws, meanwhile, have instituted a prohibition of swear words
that can be used to censor content (BBC News, 2014b), enforced selective anti-piracy
regulations that can shut down domains (Kozlov, 2013) and categorized bloggers with
more than 3000 readers per day as mass media channels, placing on them cumbersome
fact-checking obligations (BBC News, 2014a). Second, as far as economic means of
media control are concerned, there is a definite shift in Russia toward a form of state
monopoly capitalism, in which the government intervenes to form and protect certain
monopolies and block competition. The largest media assets in the country are controlled
by the state or by oligarchs who are loyal to the Kremlin. A recent bill limiting foreign
ownership, control or operation of media channels to 20% consolidates this status quo; it
is basically impossible to run a media organization if one is not on good terms with the
Kremlin. In January of 2014, for instance, VKontakte went through a severe re-organization.
Founder Pavel Durov was dismissed as CEO and forced to sell his shares of the company.
Durov had refused to block the page of Alexei Navalny and to hand over Ukrainian protest-
ers’ data to the Federal Security Service. He eventually fled the country, and VKontakte is
now in the hands of owners friendly to the regime. Finally, the third means of control is
technical. What is interesting is that apart from state initiatives like SORM (discussed
above), there are technical approaches that demonstrate direct or indirect cooperation
with the private sector in Russia, as well as in the west. For example, in 2014, Russian
search engine company Yandex began showing different online maps of Crimea, one
showing the peninsula as part of Ukraine, the other one as part of Russia. The idea was
to give the corresponding set of users from each nation a view of reality that matches the
perspective of their respective governments, so that the company could remain on the
good side of all parties, and thus continue to be profitable (Soldatov and Borogan, 2015:
303). But some western corporations are also participants in these tactics. For example,
Boston-based Crimson Hexagon, whose social media analytics tools are used by aca-
demics in the west to study public discourse and mobilization in Russia and Ukraine
(Etling et al., 2010), has also worked with intelligence agencies in Russia to help them
use the same tools to monitor citizens and activists (Soldatov and Borogan, 2015: 282).
One more example of technological means of control serves to illustrate potential appli-
cations beyond the Slavic region. At one point in January 2014, protesters who had con-
gregated around the Maidan in Kiev simultaneously received a text message on their
phones that read: ‘Dear subscriber, you are registered as a participant in a mass distur-
bance’ (Soldatov and Borogan, 2015: 278). Phone companies denied any involvement,
and this blunt counter-insurgency strategy, roundly ridiculed by citizens, only served to
embolden the protesters even further. The irony is that in a democratic country citizens
might actually be more intimidated by receiving similar messages, simply because per-
ceived freedom and security might make them less averse to risk.
These comparisons between post-Soviet and western contexts, while somewhat spec-
ulative, are necessary. It is easy to critique the disinformation approaches discussed
throughout this essay as examples of authoritarian attempts to control new media plat-
forms, but it is not that difficult to point to parallel tactics employed by democratic
regimes. To be sure, there is a unique set of conditions in Russia that differentiate this
case from the rest: weak rule of law, no independent judiciary, no freedom of speech or
human rights protections, telecommunications regulation in the interest of the powerful
and a propensity to silence political opposition (MacKinnon, 2012: 90). But for each of
the media control strategies in the post-Soviet context, there is an analogue or mirror
image in the western world. Both approaches share characteristics such as deregulation
of industry in a manner that gives more market power to favored corporations; increased
state power to impose special measures of surveillance during increasingly permanent
periods of emergency; a discourse of patriotism which shames dissenters and encourages
self-censorship; collaboration between government and private sector to develop and
implement technologies for surveillance; and increased secrecy about what governments
and corporations do with data collected from citizens, all in the name of security and
anti-terrorism. The media strategies that democratic states employ to surveil citizens –
strategies which frequently replicate those of non-democratic regimes – have been docu-
mented (Howard and Hussain, 2013). Furthermore, we know that surveillance is a
profitable emerging global industry, with democracies and non-democracies alike spend-
ing US$178 billion in 2010, and a projected US$2.7 trillion over the next decade (Hayes,
2012). A 2010 report indicated that in the United States alone there were 1931 private
firms doing classified work for 1271 government organizations (Hayes, 2012). Other
investigative reports suggest that agencies such as the FBI have possibly committed ‘tens
of thousands’ of legal violations while monitoring citizens (EFF, 2011).
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this
article.
References
Alexander L (2015a) Open-source information reveals Pro-Kremlin web campaign. Global
Voices. Available at: https://globalvoices.org/2015/07/13/open-source-information-reveals-
pro-kremlin-web-campaign/ (accessed 19 February 2016).
Alexander L (2015b) Social network analysis reveals full scale of Kremlin’s Twitter bot campaign.
Global Voices. Available at: https://globalvoices.org/2015/04/02/analyzing-kremlin-twitter-
bots/ (accessed 19 February 2016).
Barry E (2014) Russia blocks web content amid tension over Ukraine. The New York Times, 13
March. Available at: http://www.nytimes.com/2014/03/14/world/europe/russia-blocks-web-
content-amid-tension-over-ukraine.html (accessed 14 April 2015).
BBC News (2014a) Russia enacts ‘draconian’ law for bloggers and online media. BBC News,
1 August. Available at: http://www.bbc.com/news/technology-28583669 (accessed 14 April
2015).
BBC News (2014b) Russian law bans swearing in arts and media. BBC News, 5 May. Available
at: http://www.bbc.com/news/world-europe-27286742 (accessed 14 April 2015).
Beard N (2014) Facebook and Twitter ‘ready for complete block’ after refusing further censor-
ship, Rain TV reports. The Calvert Journal. Available at: http://calvertjournal.com/news/
show/3482/facebook-and-twitter-ready-for-complete-block-after-refusing-further-censor
(accessed 14 April 2015).
Berkman F (2014) Russia blocks Pro-Ukraine groups on social media. Mashable. Available at:
http://mashable.com/2014/03/03/russia-ukraine-internet/ (accessed 14 April 2015).
Booton J (2014) Twitter to offer advertising services in Ukraine. MarketWatch. Available at: http://
www.marketwatch.com/story/twitter-to-offer-advertising-services-in-ukraine-2014-08-26
(accessed 14 April 2015).
Chen A (2015) The agency. The New York Times, 2 June. Available at: http://www.nytimes.
com/2015/06/07/magazine/the-agency.html (accessed 19 February 2016).
Danilova M (2014) Truth and the Russian media. Columbia Journalism Review. Available at: http://
www.cjr.org/behind_the_news/truth_and_russian_media.php (accessed 14 April 2015).
Deleuze G (1997) Negotiations 1972–1990. New York: Columbia University Press.
Diamond L (2012) Liberation technology. In: Diamond L and Plattner MF (eds) Liberation
Technology: Social Media and the Struggle for Democracy. Baltimore, MD: Johns Hopkins
University Press, pp. 3–17.
Dougherty J (2014) Everyone Lies: The Ukraine Conflict and Russia’s Media Transformation.
Cambridge, MA: Shorenstein Center on Media, Politics and Public Policy, John F. Kennedy
School of Government, Harvard University. Available at: http://shorensteincenter.org/everyone-lies-ukraine-conflict-russias-media-transformation/ (accessed 14 April 2015).
EFF (2011) Patterns of misconduct: FBI intelligence violations from 2001-2008. Electronic
Frontier Foundation. Available at: https://www.eff.org/wp/patterns-misconduct-fbi-intelligence-violations (accessed 7 June 2016).
Etling B, Faris R, Palfrey J, et al. (2010) Public Discourse in the Russian Blogosphere: Mapping
RuNet Politics and Mobilization. Cambridge, MA: Berkman Center for Internet & Society
at Harvard University. Available at: https://cyber.law.harvard.edu/publications/2010/Public_
Discourse_Russian_Blogosphere (accessed 9 October 2015).
Garmazhapova A (2013) Где живут тролли. И кто их кормит [Where the trolls live. And who
feeds them.]. Novaya Gazeta. Available at: http://www.novayagazeta.ru/politics/59889.html
(accessed 19 February 2016).
Gray J (2015) Under western eyes. Harper’s Magazine, January, pp. 11–16.
Gregory PR (2014) Putin’s new weapon in the Ukraine propaganda war: internet trolls. Forbes.
Available at: http://www.forbes.com/sites/paulroderickgregory/2014/12/09/putins-new-weapon-
in-the-ukraine-propaganda-war-internet-trolls/ (accessed 14 April 2015).
Grove T (2013) Russia passes anti-gay law, activists detained. Reuters, 11 June. Available at:
http://www.reuters.com/article/us-russia-gay-idUSBRE95A0GE20130611 (accessed 1 March
2016).
Harding L (2014) Putin considers plan to unplug Russia from the internet ‘in an emergency’. The
Guardian, 19 September. Available at: http://www.theguardian.com/world/2014/sep/19/
vladimir-putin-plan-unplug-russia-internet-emergency-kremlin-moscow (accessed 14 April
2015).
Hayes B (2012) The surveillance-industrial complex. In: Ball K, Haggerty K and Lyon D (eds)
Routledge Handbook of Surveillance Studies. New York: Routledge, pp. 167–175.
Herman ES and Chomsky N (1998) Manufacturing Consent: The Political Economy of the Mass
Media. New York: Pantheon.
Howard P (2010) The Digital Origins of Dictatorship and Democracy: Information Technology
and Political Islam, 1st edn. New York: Oxford University Press.
Howard PN and Hussain MM (2013) Democracy’s Fourth Wave?: Digital Media and the Arab
Spring. New York: Oxford University Press.
Korrespondent.net (2012) In Ukraine robot commentators manipulated discussions on the
Internet [В Украине проплаченные комментаторы манипулируют дискуссиями в
интернете]. Korrespondent.net, 24 September. Available at: http://korrespondent.net/ukraine/
politics/1398610-v-ukraine-proplachennye-kommentatory-manipuliruyut-diskussiyami-v-
internete-freedom-house (accessed 19 April 2015).
Kozlov V (2013) Russia amends anti-piracy law to specify procedure for blocking illegal content. The Hollywood Reporter. Available at: http://www.hollywoodreporter.com/news/russia-amendments-anti-piracy-law-661475 (accessed 14 April 2015).
Kozyrev M (2014) Цифра дня: сколько экстремистских сайтов заблокировал Роскомнадзор
[Figure of the day: how many extremist websites were blocked by Roskomnadzor]. Apparat.
Available at: http://apparat.cc/news/rcn-vs-extreme/ (accessed 14 April 2015).
Leshchenko S (2014) The media’s role. Journal of Democracy 25(3): 52–57.
Lewis JA (2014) Reference Note on Russian Communications Surveillance. Washington, DC:
Center for Strategic and International Studies. Available at: http://csis.org/publication/reference-note-russian-communications-surveillance (accessed 22 October 2015).
Lipman M (2014) Russia’s nongovernmental media under assault. Demokratizatsiya 22(2):
179–190.
Lokot T (2014) Russian social networks dominate in Ukraine despite information war. Global
Voices. Available at: http://globalvoicesonline.org/2014/09/01/ukraine-russia-social-networks-information-war/ (accessed 14 April 2015).
Lysenko VV and Desouza KC (2014) Charting the coevolution of cyberprotest and counteraction:
the case of former Soviet Union states from 1997 to 2011. Convergence: The Journal of
Research into New Media Technologies 20(2): 176–200.
McChesney RW (2014) Digital Disconnect. New York: The New Press.
MacKinnon R (2012) China’s ‘networked authoritarianism’. In: Diamond L and Plattner MF (eds)
Liberation Technology: Social Media and the Struggle for Democracy. Baltimore, MD: Johns
Hopkins University Press, pp. 78–92.
Mejias UA (2013) Off the Network: Disrupting the Digital World. Minneapolis, MN: University
of Minnesota Press.
Miazhevich G (2015) Sites of subversion: online political satire in two post-Soviet states. Media,
Culture & Society 37(3): 422–439.
Mirowski P (2013) Never Let a Serious Crisis Go to Waste: How Neoliberalism Survived the
Financial Meltdown, 1st edn. London: Verso Books.
Morozov E (2012) The Net Delusion: The Dark Side of Internet Freedom. New York:
PublicAffairs.
Pariser E (2012) The Filter Bubble: How the New Personalized Web Is Changing What We Read
and How We Think. New York: Penguin Books.
Shevtsova L (2014) The Russia factor. Journal of Democracy 25(3): 74–82.
Soldatov A and Borogan I (2015) The Red Web: The Struggle between Russia’s Digital Dictators
and the New Online Revolutionaries. New York: PublicAffairs.
Sparks C (2007) Extending and refining the propaganda model. Westminster Papers in
Communication and Culture 4(2): 68–84.
Standish R (2014) Is Vladimir Putin covering up the deaths of Russian soldiers in Ukraine? Foreign
Policy. Available at: http://foreignpolicy.com/2014/09/01/is-vladimir-putin-covering-up-the-
deaths-of-russian-soldiers-in-ukraine/ (accessed 14 April 2015).
Stopfake.org (2014a) Fake: nationalists prevented paramedic from saving a wounded. Available at:
http://www.stopfake.org/en/fake-nationalists-prevented-paramedic-from-saving-a-wounded/
(accessed 19 February 2016).
Stopfake.org (2014b) FAKE: neo-fascists in Lviv were beating up an elderly lady, who was going
to put flowers on Lenin’s monument. Available at: http://www.stopfake.org/en/fake-neo-
fascists-in-lviv-were-beating-up-an-elderly-lady-who-was-going-to-put-flowers-on-lenin-s-
monument/ (accessed 19 February 2016).
Stopfake.org (2014c) Fake photos appeared on the Internet, of children who were supposedly
killed in Eastern Ukraine. Available at: http://www.stopfake.org/en/fake-photos-appeared-
on-the-internet-of-children-who-were-supposedly-killed-in-eastern-ukraine/ (accessed 19
February 2016).
Stopfake.org (2014d) Snapshot of movie The Brest Fortress is being presented as a photo of
Donbass. Available at: http://www.stopfake.org/en/snapshot-of-movie-the-brest-fortress-is-
being-presented-as-a-photo-of-donbass/ (accessed 19 February 2016).
Van der Velden L (2015) Leaky apps and data shots: technologies of leakage and insertion in
NSA-surveillance. Surveillance & Society 13(2): 182–196.
Vandiver J (2014) SACEUR: allies must prepare for Russia ‘hybrid war’. Stars and Stripes.
Available at: http://www.stripes.com/news/saceur-allies-must-prepare-for-russia-hybrid-
war-1.301464 (accessed 9 October 2015).
Walker S (2015) Salutin’ Putin: inside a Russian troll house. The Guardian, 2 April. Available
at: http://www.theguardian.com/world/2015/apr/02/putin-kremlin-inside-russian-troll-house
(accessed 19 February 2016).
Wu T (2010) The Master Switch: The Rise and Fall of Information Empires. New York: Alfred
A. Knopf.