
Discussion Paper

POLISCI 3395
Why Should We Care about Propaganda in Communication?
The article "Why Should We Worry About Propaganda in Communication?" focuses on behavioral changes caused by online political propaganda (OPP). It rests on an implicit assumption that OPP is more effective and efficient than propaganda in the physical world. The collection and sale of personal data for profit by social media corporations are key enablers of OPP's growing influence. An individual's personal data include raw data on platform interactions as well as additional data sets that can be processed by algorithms built on predictive behavioral modelling. These predictive algorithms identify behavioral patterns, which an OPP campaign can then exploit and steer into alignment with its political goals.
Advertisements on social media platforms are well suited to this purpose because they seek to drive behavioral change through conscious or subconscious persuasion and are the platforms' primary source of revenue. The author takes a critical stance on OPP, discussing both the difficulty of distinguishing between organic and inorganic content on social media platforms and the inadequacy of empirical research on the extent of behavioral change driven by online advertising. In this context, organic content refers to unpaid user-generated content that a platform's algorithm promotes on its own. Inorganic content refers to content generated by political bots, sock puppets, and coordinated groups of influencers, which frequently misleads the algorithm into promoting the content as if it were genuinely popular, thereby spreading disinformation. Advertising has become harder to study because users can promote their accounts or content without anyone, including the platform and its algorithm, recognizing the content's inorganic nature. As a result, behavioral change driven by online advertisement is difficult to establish empirically, because inorganic content is treated not as advertising but as paid promotional work, which in the context of OPP serves a political objective.
The difficulty of collecting and analyzing empirical data also stems from the inability to characterize and gauge the uneven, wide-ranging sociological repercussions of behavioral change. The article contends that a lack of scientific data and literature does not imply the absence of behavioral change due to OPP, and therefore should not be used to postpone concrete efforts to prevent it. A response to OPP and its intended and unintended behavioral modification cannot be delayed; political disinformation and state-sponsored smear campaigns, such as those against journalists and human rights activists, should be combated even without a holistic understanding of behavioral modification's implications. The author closes by arguing that purely empirical data and analysis may be insufficient because the theories informing that analysis were developed in a different social and technological context. Instead, the emphasis should fall on second-order, or system-level, changes that aggregate individual changes. Second-order changes are more easily observable and may better explain the broad implications of behavioral modification than the examination of individual behavioral changes.
I believe the author is correct about why empirical data on behavioral change through advertising and OPP are insufficient: the implications of these changes for a society are multi-dimensional and carry long-term consequences that cannot be gauged in a controlled environment, given the number of uncontrollable variables. Events such as the 2021 US Capitol riot persuade me that this inadequacy should not be used to justify delaying a response to OPP manipulation. OPP techniques such as disinformation and deepfakes can transform a society, and without analysis of behavioral change at both the individual and systemic levels, their long-term societal implications cannot be gauged. Moreover, OPP can target international political or social issues whose consequences are not confined to a single geographic location, making it even harder to untangle the causal variables linking individual or systemic behavioral change to the transformation of a society.

Data protection and encryption legislation, which would prohibit arbitrary access to data for the purpose of selling it, is one preventive measure the article does not discuss. Under such legislation, data could be collected and stored on systems built on secure technologies such as blockchain, which cybersecurity firms, governments, and large corporations already use to protect data from cyberattacks. The argument here is that social media platforms could develop their own algorithms to process data stored on such secure systems. Any such legislation should also ensure that adopting the technology is not prohibitively expensive; otherwise, social media companies would look for ways to sell the information to cover the costs of the transition, rendering the legislation ineffective.

A conundrum arises from the discussion of sock puppets and political bots. The normative argument is that social media companies should build into their algorithms the ability to identify low-use accounts on their platforms, along with patterns of use such as the content those accounts share, the duration of their logins, and other general patterns of activity. The conundrum is that doing so would raise concerns about data collection, privacy, and data security, given that the article notes social media companies already sell data for profit. Even if these concerns could be addressed by enacting data protection legislation like that discussed above, developing such an algorithm would require input data. Obtaining that initial input is also difficult: no political organisation conducting an OPP campaign would volunteer its information or data, and even if one did, the current literature on OPP does not specify whether such campaigns are static or dynamic in nature. The input data would therefore be insufficient to counter the AI tools and techniques OPPs use for disinformation and other AI-related manipulation. Moreover, different OPPs' tools and techniques may vary with their objectives and methods. A smear campaign, for example, may require deepfakes or edited videos, whereas a disinformation campaign may only require an echo chamber built by political bots and sock puppets. In short, there is no one-size-fits-all solution for responding to OPP tools and techniques.
