
Political Rabbit Holes: An Analytical Approach to Agenda Setting Theory

and Political Opinion

Danielle L. Annecston

Department of Emerging Media Studies, Boston University

EM 777 B1: Masters Collaboratory Project

Dr. Kelsey Prena Ellis

October 6, 2020

Participating in a social media experience today closely resembles falling down a rabbit hole. Rabbit holes are all-encompassing alternate realities that leave those who fall feeling disoriented and off balance. The daily interaction with social media is the fall, and the platforms themselves are the alternate reality. The apprehension is not knowing how people will land on the other side of their social media use. This leads to the guiding question for this study: as social media becomes a primary source of daily information, how does interaction with these platforms affect us?

In a time when people get their daily political updates from the President's tweets, we should question where personal political opinions come from. To best answer this question, we need to study the effect of social media on political opinions. This paper's thesis is that by analyzing the relationship between the alt-right movement's messaging on social media and the mechanics of filter bubbles, we can understand how agenda setting affects political opinions. Once that effect is studied, the theoretical framework can be applied to emerging recommender-system social platforms, such as TikTok. We must go down the rabbit hole to see how political opinions develop through a social platform in order to understand the effect social media has on human opinion.

Literature Review

Agenda Setting and Social Media

Distributing mass media news content in microsocial forms, such as through channels on social platforms, is one of the most prevalent ways political opinions develop. Kiousis et al. (2006) assert that agenda setting theory describes the media's ability to influence what information is salient on the public agenda. Feezell (2018) builds on this by pointing out that social platforms now play a role in deciding what information is salient to the public agenda. It has also been demonstrated that political opinion can be affected by agenda setting through a social platform. Feezell (2018) shows this with a longitudinal study that exposed college-aged Facebook users to political or nonpolitical media over a period of 75 days. Participants who had indicated no interest in politics at the beginning of the experiment but were exposed to political media felt strongly about politics by the end (Feezell, 2018).

To further explore the idea of political agenda setting through social platforms, we must compare young users to older users. Kiousis et al. (2006) explore the relationship between age and political development by studying high school students participating in the Kids Voting USA campaign. Their study finds that students' peers greatly influence their opinions (Kiousis et al., 2006). At the time of that experiment, social platforms were not a necessity of daily life, but the influence of peer opinion becomes even more prevalent on a social platform because of the interactivity inherent to being online. To understand the development of political opinions in a social context, we must look into how agenda setting affects youths' political opinions on social platforms. Prior studies have focused on how agenda setting influences either younger audiences' opinions or older audiences' opinions; the missing gap is a comparison of which age group is affected more. Closing that gap is especially useful given the rise in gaining information from social platforms and the gravitation of youths aged 16 to 24 toward using social platforms every day (Weimann & Masri, 2020, p. 4). This information leads to my first hypothesis: H1: Younger audience members' opinions will be more influenced than older audience members' opinions.

Filter Bubbles

The consumption of everyday media within an interactive digital social platform is tailored to viewers' preferences. Social platforms use recommender systems to personalize the information users consume (Nagulendra & Vassileva, 2014). Algorithms pick up key data from users as they use social platforms, then provide content based on knowledge of their personal history on the platform (Ahmed et al., 2020). Based on that knowledge, algorithms decide what content is salient for the user and filter less salient information away from the user's platform (Nagulendra & Vassileva, 2014). This process of information prioritizing is unknown to the user (Nagulendra & Vassileva, 2014). Users then continue to consume the information the algorithm filters as salient in a cycle that keeps them on the platform: they will keep consuming as long as they enjoy what they consume. This cycle of information is referred to as the filter bubble, a term introduced by Eli Pariser in 2011 (Ahmed et al., 2020). It describes the echo chamber a user lives in based on the personalized information provided to them every day (Bryant, 2020). Because of this effect, users miss the "flexibility and openness" that come with differences in content (Bryant, 2020, p. 88).
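To make this feedback loop concrete, the following minimal Python sketch simulates it; the topics, click probabilities, and weighting rule are illustrative assumptions rather than any platform's actual recommender system:

```python
import random
from collections import Counter

# Hypothetical content topics; a real platform tracks far richer signals.
TOPICS = ["politics", "sports", "music", "cooking", "gaming"]

def recommend(history, k=3):
    """Sample a feed, weighting each topic by the user's past clicks.
    The user never sees this ranking step (the 'unknown' filtering)."""
    weights = [1 + history[topic] for topic in TOPICS]
    return random.choices(TOPICS, weights=weights, k=k)

random.seed(1)
history = Counter()
for day in range(30):
    for item in recommend(history):
        # Users are assumed slightly more likely to click familiar content.
        if random.random() < 0.3 + 0.1 * min(history[item], 5):
            history[item] += 1  # each click feeds back into the weights

print(history.most_common())  # engagement piles onto a few topics
```

Even in this toy model, small early preferences compound: after a few weeks of simulated feeds, engagement concentrates on a handful of topics while the rest are filtered out of view.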

Most studies of filter bubbles have weighed their positive and negative effects against the overwhelming consumption of unfiltered information. Some argue that being able to select the elements of personalization in filter bubbles brings an awareness to the user and "a feeling of control over their data streams" (Nagulendra & Vassileva, 2014, p. 1). Yet even when users control the personalization, the filter bubble still exists, since information is still being filtered out. What is missing is observation of the effect on a user while inside the filter bubble, especially on a topic as polarizing as politics. This leads to the first guiding research question: RQ1: Can we observe the effects of agenda setting filter bubbles?

Research commonly finds that filter bubbles do not determine opinions, especially in politics, since most people use search methods to retrieve their information instead of gaining it from feeds (Blank et al., 2017). To contrast this argument, Ahmed et al. (2020) researched whether filter bubbles can lead to ideological political reinforcement through agenda setting in YouTube's search algorithm. The study focused on participants who held different political opinions on the 2016 election. By entering search requests based on participants' political views, the researchers tested whether YouTube would continue to recommend videos related to the political search request (Ahmed et al., 2020). Participants would then watch a few of the recommended videos. Results showed that participants' political ideology did not change, but video recommendations did continue to align with the original search terms related to their political views (Ahmed et al., 2020). This is problematic because users are more likely to trust information similar to their own opinion than to seek out a differing point of view on the topic at hand (Ahmed et al., 2020). It could be detrimental to an audience with no political opinion watching a political video for the first time, especially when the agenda in the video is set by the persuasive alt-right movement. Drawing on this reasoning, I propose another hypothesis: H2: There will be an association between conveyed agenda setting and audience alt-right opinion.

Red-Pilling and Social Media

The alt-right movement refers to "a political ideology that centers on one or more of the following elements: strident nationalism, fascism, racism, anti-Semitism, anti-immigration, chauvinism, nativism, anti-LGBTQ, and xenophobia" (Weimann & Masri, 2020, p. 2). A 2017 survey conveys that only 9% of the US population identifies as part of the alt-right movement (Bryant, 2020, p. 89), yet the movement has created a strong community base online. That community base hones its messaging on social platforms by appealing to the aesthetics of subcultures in misogyny, trolling, and gaming (Weimann & Masri, 2020). Recruitment to the movement happens across all social platforms and is often referred to as "red-pilling": the conversion of someone to "fascist, racist, and anti-Semitic beliefs" (Bryant, 2020, p. 88). The group distributes its agenda ideology through texts, memes, and videos on social platforms (Weimann & Masri, 2020). This information leads the study to another research question: RQ2: How does existing alt-right opinion influence the effect of conveyed agenda setting on alt-right opinion? To explore this further, we must look at the social platform YouTube.

YouTube is regarded as one of the most popular video-sharing platforms (Preece & Rotman, 2010), making it essential to an alt-right movement that conveys its agenda setting through visuals. Through recommender filter bubbles (Ahmed et al., 2020), viewers easily enter an echo chamber of recommendations from the alt-right movement. Research also shows that the YouTube algorithm uses video tags, the categorical terms used to describe the contents of videos, to determine the next video in the filter bubble (Ahmed et al., 2020). An alt-right agenda setting filter bubble can begin with a viewer watching a video that makes no mention of political views in its title, while the creator has embedded the conveyed agenda in the video tags. This leads an unassuming viewer into consuming information from sources they trust and believing only that information to be true, since nothing else is introduced into the echo chamber. The algorithm decides what information is salient in a way that is not known to the viewer (Nagulendra & Vassileva, 2014), meaning a viewer can be caught in an agenda setting filter bubble on YouTube without their knowledge.
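A minimal sketch of this tag-overlap mechanism appears below; the Video structure, tag names, and scoring rule are hypothetical illustrations for exposition, not YouTube's actual recommender:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    tags: set  # assigned by the creator; never shown to the viewer

def next_video(watched, candidates):
    """Recommend the candidate sharing the most tags with the watched video."""
    return max(candidates, key=lambda v: len(v.tags & watched.tags))

watched = Video("Campus Life Vlog", {"vlog", "campus", "agenda_tag"})
candidates = [
    Video("Pasta Night", {"cooking", "food"}),
    Video("What They Won't Tell You", {"agenda_tag", "vlog"}),  # politically tagged
]
print(next_video(watched, candidates).title)  # "What They Won't Tell You"
```

The point of the sketch is that the politically tagged video wins the recommendation even though its title, like the tags themselves, gives the viewer no visible signal of the agenda behind it.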

Based on the tactics the alt-right movement uses on YouTube to distribute its ideology, a comparison of its methodology can be made to other social platforms that use recommender systems to protect users from information overload. The pivotal connection to make in 2020 is to TikTok. The platform is regarded as the fastest growing application of 2020, with over 2 billion downloads since its creation and 90% of users saying they use the platform daily (Weimann & Masri, 2020). TikTok is also noted to have a largely young audience, with 41% of users between the ages of 16 and 24 (Weimann & Masri, 2020, p. 4). Alt-right movement accounts have distributed information through TikTok trends, gaining upwards of 2,400 views on a single video (Weimann & Masri, 2020, p. 9). Due to the fast emergence of TikTok in 2020, few studies of the platform have been conducted. Weimann and Masri (2020) take a first step by looking into the dark side of TikTok, but we lack an understanding of the influence that dark side has on the daily opinions of users aged 16 to 24 (Weimann & Masri, 2020, p. 4). This cross comparison with YouTube users' information consumption leads the study to its last hypothesis: H3: Those who are more dependent on TikTok for information will be more influenced by conveyed agenda setting.

Conclusion

By applying agenda setting theory to the filter bubbles built into social media algorithms, we may examine how much social platforms curate personal opinion. While the example of the alt-right movement's online recruitment reveals a cause, we must push further to understand the effect. Most established social platforms recognize the effect of agenda setting in filter bubbles and set community guidelines to try to protect users (Weimann & Masri, 2020). On emerging media platforms, there has not yet been enough time to establish which guidelines help. This is why we must explore the political rabbit hole of TikTok, where young viewers gain their everyday information.

This calls for more research on the YouTube algorithm, the TikTok algorithm, the technological determinism of emerging social platforms, and differences in alt-right movement recruitment across age groups.



References

Ahmed, S., Cho, J., Hilbert, M., Liu, B., & Luu, J. (2020). Do search algorithms endanger democracy? An experimental investigation of algorithm effects on political polarization. Journal of Broadcasting & Electronic Media, 64(2), 150-172. https://doi.org/10.1080/08838151.2020.1757365

Blank, G., Dubois, E., & Dutton, W. (2017). Social shaping of the politics of internet search and networking: Moving beyond filter bubbles, echo chambers, and fake news. Quello Center Working Paper No. 2944191. https://doi.org/10.2139/ssrn.2944191

Bryant, L. (2020). The YouTube algorithm and the alt-right filter bubble. Open Information Science, 4(1), 85-90. https://doi.org/10.1515/opis-2020-0007

Feezell, J. (2018). Agenda setting through social media: The importance of incidental news exposure and social filtering in the digital era. Political Research Quarterly, 71(2), 482-494. https://doi.org/10.1177/1065912917744895

Kiousis, S., McDevitt, M., & Wu, X. (2006). The genesis of civic awareness: Agenda setting in political socialization. Journal of Communication, 55(4), 756-774. https://doi.org/10.1111/j.1460-2466.2005.tb03021.x

Nagulendra, S., & Vassileva, J. (2014). Understanding and controlling the filter bubble through interactive visualization: A user study. In Proceedings of the 25th ACM Conference on Hypertext and Social Media (HT '14). Association for Computing Machinery. https://doi.org/10.1145/2631775.2631811

Preece, J., & Rotman, D. (2010). The 'WeTube' in YouTube: Creating an online community through video sharing. International Journal of Web Based Communities, 6(3), 317-333. https://doi.org/10.1504/IJWBC.2010.033755
Weimann, G., & Masri, N. (2020). Research note: Spreading hate on TikTok. Studies in Conflict & Terrorism. Advance online publication. https://doi.org/10.1080/1057610X.2020.1780027
