Oren Soffer
This article juxtaposes the classic role of opinion leaders with the current notion of personalized algorithms, arguing that
opinion leaders and algorithms both function as gatekeeping agents. I also discuss
the nature and role of peer groups in the two cases, arguing that while in the
original theory, groups were seen as relatively solid (family, friends, and work
colleagues), groups in the algorithmic era are much more liquid, transforming
according to data inputs and users’ behavior. Finally, the article also considers the differences between the gatekeeping mechanisms of the two eras, as well as the different social settings and public awareness in the second step of the flow.
The current media environment poses new and comprehensive challenges for communication theory, as classic media theories—ideas that evolved and matured throughout the 20th century—have seemingly been undermined
(Chaffee & Metzger, 2001; Napoli, 2010; Soffer, 2010, 2013). My underlying assumption in
this theoretical paper is that "old" media theories—the array of ideas relating to traditional
mass communication—are important tools in the critical study of the digital era. How we adapt
such theories to preserve their relevance often provides important indicators of the
similarities and differences between the traditional mass communication era and the current
data environment.
I will focus in this study on a seminal theory in communication studies: that is, the
two-step flow of communication, usually attributed to Katz and Lazarsfeld (1955/1966). The
innovation of this theory lies in its highlighting the role of interpersonal relations in the
mediation of mass communication outlets. The theory assumes that most people get their
information not directly from the media but instead through personal sources, thus
emphasizing the role of social groups and opinion leaders in this mediation process.
While attempts to examine the role of the two-step flow theory in the digital media
environment have so far focused mainly on the socio-political role opinion leaders play in
social networks, I will argue that the mediation of opinion leaders is often replaced by algorithmic calculation, which imposes grouping clusters onto individuals, taking into
account users’ online history. The role and definition of the group or relevant other peers
have changed in the digital environment. Group clusters are much more instrumental,
temporal, dynamic, and present-oriented than the groups of families, friends, or work
colleagues imagined in the original theory. Yet, groups have not lost their relevance, playing a central role in the personalization process.
I argue that the attempt to "personalize" the algorithmic flow—that is, giving it an
interpersonal character such that it seems the algorithm knows the users and their tastes—
continues and strengthens trends in traditional mass communication: to give the audience the
feeling of being personally addressed by the mass media (Horton & Wohl, 1956; Scannell, 2000). By adapting the two-step flow theory, I highlight the crucial role of algorithms in the current socio-political
sphere, as well as the risks of putting such an integral part of public decision making in non-transparent, privately controlled hands.
The idea that mass communication content is mediated through personal contacts emerged in
studies of the US presidential election campaign in the 1940s. These studies found that
personal relationships had a stronger effect than the mass media on decision making during
the campaign. Voters revealed that their decisions were often influenced by family members, friends, and co-workers within relatively homogeneous groups (distinguished by age, occupation, or political orientation) (Katz, 1957). These results pointed to the conclusion that, within each group, certain people played a central role in mediating relevant information between
the mass media and the group (Katz & Lazarsfeld, 1955/1966; Katz, 1957). These leaders,
having relatively large circles of contacts and interest in specific domains (Winter &
Neubaum, 2016), were asked their opinions on issues being discussed in the media, and in
this way influenced others. This mediated influence of mass communication content differed from the direct influence of unmediated exposure to the media.
The insights of scholars like Katz and Lazarsfeld led others to the acknowledgment of
the role of social contacts in mediating information that originated in mass communication,
revealing the important communicational role of opinion leaders. These leaders are an
integral part of each social group and stratum, present in its everyday give-and-take, and their
opinions resemble the views of the people they inform and influence (Katz & Lazarsfeld,
1955/1966; Katz, 1957). They are considerably more exposed than other group members to
formal mass communication (Katz, 1957). The two-step flow theory assumes that media
contents often "flow from radio and print to opinion leaders and from them to the less active
sections of the population" (Katz & Lazarsfeld, 1955/1966, p. 32). This flow process is
relevant with regard not only to public affairs issues, but also to other fields, like marketing
or fashion. The characteristics of opinion leaders vary from one field to the next, and leaders
and followers of the same social group might exchange roles in different contexts or spheres. The mediated content was filtered through the opinion leader’s personal (and therefore also often the group’s) agenda and beliefs.
Opinion leaders were therefore active gatekeepers in the process of exposing media content
to the “passive” majority in society (Laughey, 2007). Their authority was seen to be anchored
in their personal characteristics (for example, in fashion, youthfulness played a major role)
and in the knowledge they were thought to hold (through for example their personal contacts
outside the group). They were seen as role models for other group members: "In other words,
it takes two to be a leader—a leader and a follower" (Katz, 1957, p. 74). While opinion
leaders might be more interested than other group members in matters in which they were
“experts,” it is unlikely that their level of interest will differ dramatically (Katz, 1957).
The two-step flow had an important role in directing scholarly attention beyond the
study of mass media effects on cognitive, emotional, and behavioral levels, which occur
following the exposure of a specific individual to a specific medium. The theory pointed out
that any comprehensive evaluation of the impact of mass media requires taking into account
interpersonal discussions about media content that occur over time (Jensen, 2016). The media
impact is therefore not limited to its audience, as the media contents diffuse through
interpersonal relationships to groups that were not exposed to the original message. This
undermines the distinction that often stands at the basis of media effects studies—that is, between those who were exposed to a certain message and those who were not. As Vu and Gehrau (2010) show in their study of agenda diffusion, interpersonal discussion about certain content from mass media strengthens the issue’s importance among those who were exposed to the content, affects the issue’s estimated importance among those who were not exposed to it, and sometimes even motivates them to consume the content that was the object of the
interpersonal discussion.
Along with recognition of its importance, the two-step flow theory also received
severe criticism (Himelboim, Gleave, & Smith, 2009). Weimann’s (1982) critique of the
theory’s assumptions referred to several issues, among them its neglect of the possibility of a
one-step flow from mass communication to certain people, as well as its focus on the vertical
flow from the media to the people while ignoring horizontal flow of opinion-sharing among
peers or non-media sources. As I will discuss later in the article, these lines of criticism are also relevant to the algorithmic era.
through the distinction between face-to-face and technologically mediated communication. She sees mediation as a process embodied in a specific context and in human negotiation of media content (Lievrouw,
2009). The next important touchstone she identifies in the relationship between face-to-face
Lievrouw (2009), the new media "crisis" encouraged a scholarly discourse that "focuses on
The seminal role of the two-step flow theory in communication studies, along with its
networked, contextual characteristics, drew scholarly attention to the relevance of this theory
in the new media environments—in particular regarding the role of opinion leaders in social
networks such as Facebook and Twitter. As Karlsen (2015) explains, the diffusion of content
in social networks depends on active users, such as opinion leaders. Thus, the logic of
diffusing content from active leaders to more passive individuals resembles the diffusion logic of the original two-step flow. Examining the sharing of news content, researchers found that "news shared by friends on Facebook is perceived as
more trustworthy than stories received directly from the outlet" (Turcotte, York, Irving,
Scholl, & Pingree, 2015, p. 529). Furthermore, opinion leaders’ recommendations of specific
news articles increased followers' future attention to other news coming from the same news
source. Another study (Karlsen, 2015), examining the diffusion of content from Norwegian
politicians and political parties on Facebook, found that most followers of these politicians
and parties were opinion leaders—being more active in the online sphere and having more
friends and followers than other people. They were also more active in the offline sphere.
Along with these empirical studies examining the relevance of the two-step theory in
the digital social network sphere, scholarly attempts were made to theoretically re-
conceptualize the flow of communication in the digital era. One of these was Bennett and
Manheim’s (2006) one-step flow theory. In their work, Bennett and Manheim examined
changes both in social and economic global structures (e.g., as discussed by Giddens [1991])
and in technology and the way users interact with it. Their analysis brought them to conclude
that current social-technological changes push towards individual isolation, undermining the
axioms of the two-step flow theory and paving the way to a one-step flow of communication.
They suggest that group mediation, which stands at the core of the two-step flow, is much
less significant in the new media era than it was in the mass media era. Observing
individuals’ direct interaction with media outlets that are tailored or channeled to match their
characteristics, the authors argue that interaction between people and technology has
changed: "Their use produces an interaction, not among members of peer groups, but between
the technology and the individual audience member" (Bennett & Manheim, 2006, p. 216). In
the decade since Bennett and Manheim’s paper was published, scholarly awareness of the
role of new digital intermediations has increased. The somewhat naïve scholarly perception
of disintermediation in the digital sphere has eroded in part because of empirical evidence of
the continued relevance of intermediators in social media networks (Turcotte et al., 2015;
Karlsen, 2015; Bravo & Del Valle, 2017; Bergström & Belfrage, 2018).
Another attempt to theorize communication flow in the digital era can be seen in
Thorson and Wells’ (2016) “curated flow” approach. Their model takes into account several curating actors that shape individuals’ exposure to content. They distinguish among the following: journalistic curation, built on editorial news judgment; strategic curation by political and commercial actors; personal curation, in the hands of the individual; socially curated flow, in which people choose the social networks (and therefore the information) they are part of and exposed to; and algorithmic curation.
Thorson and Wells (2016)—while noting that strategic curation follows Bennett and
Manheim’s one-step flow model—at the same time identify social curation with the two-step
flow model. They also note that algorithmic curation overlaps with the other four types of
curation. I will focus here on the algorithmic flow, which indeed includes various levels of mediation. I will argue that this algorithmic mediation can be understood and theorized as a two-step flow, in which all the inputs,
including those related to the “calculated peer group,” are used to personalize the contents
provided to the user. Of course, as will be discussed in the following sections, deep
conceptual adaptations are required in the two-step flow theory in order to apply it to the algorithmic environment.
Algorithms are sets of defined codified steps that are meant to solve certain problems or
fulfill defined tasks (Diakopoulos, 2015). Their computational processes allow them to make
sense out of large sets of data. An algorithm "takes some value, or set of values, as input and
produces some value, or set of values, as output" (Cormen, Leiserson, Rivest, & Stein, 2009,
p. 5). In the last decade, algorithms have played an increasing role in the media sector in
general and in digital journalism in particular. Among other things, algorithms in media-
related corporations function to (1) determine the contents that users are exposed to and (2)
create content, sometimes replacing, at least in part, human agents (Napoli, 2014).
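This abstract definition—values in, values out—can be illustrated with a toy curation function. The sketch below is purely illustrative; the function, fields, and data are invented for this article and not drawn from any actual platform. It maps an input set (a user's topic history and a pool of items) to an output set (the items reordered for that user):

```python
# A minimal, hypothetical sketch of an algorithm in the Cormen et al. sense:
# it takes a set of values as input and produces a set of values as output.
from collections import Counter

def personalize_feed(click_history, items):
    """Rank candidate items by how often their topic appears in the history."""
    topic_weights = Counter(click_history)          # input: past behavior
    return sorted(items,
                  key=lambda item: topic_weights[item["topic"]],
                  reverse=True)                     # output: a curated ordering

feed = personalize_feed(
    click_history=["sports", "sports", "politics"],
    items=[{"id": 1, "topic": "culture"},
           {"id": 2, "topic": "sports"},
           {"id": 3, "topic": "politics"}],
)
print([item["id"] for item in feed])  # → [2, 3, 1]: sports first, culture last
```

Even this trivial example shows the gatekeeping dimension discussed below: the item never clicked on ("culture") is pushed to the bottom of the output without the user's involvement.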
I will refer here to the first category of algorithms. These include applications such as search engines, which prioritize and rank search results; tools that aggregate and filter content, such as news stories or posts on social media platforms; and recommender systems, which are based on associations between similar entities (Diakopoulos, 2015; Just &
Latzer, 2017; Diakopoulos, 2016). These algorithms influence the exposure of certain
perspectives and voices. Thus, search engines, which rank and organize information, make
some of this information prominent while rendering some of it invisible. This hierarchy in the
presentation of content is part of the construction of modern power relations (Lupton, 2015).
While algorithms might appear to be neutral calculations, they are often created because of capital interest. Among other things, they are
used to improve the efficiency and accuracy of marketing by taking into account users' online
behavior. Algorithms construct reality in a way that reflects the interests of certain groups
and viewpoints (Kitchin, 2017). While some hoped that the Internet would decrease the
power of gatekeepers, algorithms perform data selection and affect how public opinion is
framed (Just & Latzer, 2017). The role of mass media in constructing social reality is well
acknowledged; however, as Just and Latzer (2017) note, the role of algorithms is different:
which operate with a time delay and are mostly targeted at well-defined general
publics and (mass) markets whose characteristics are known from limited data
households) compared to big data in the case of algorithmic reality mining and
active consumer input (e.g. feedback) and passive data (e.g. location-based,
active part in shaping broad aspects of everyday practices and attract wide social attention
(Kitchin, 2017; Seaver, 2013). Of course, this focus on technological artifacts as agents in
information retrieval and filtering did not start with the operation of algorithms. The actor-
network theory, implemented with regard to prior eras, posits that “the boundaries between
the technical and the social, and between human and machine capabilities, are frequently
contested and always negotiable” (Walsham, 1997, p. 477). Thus, the “networked” perception acknowledged, long before algorithms, the complexity of interests and logic in content selection. Yet, it seems that the algorithmic era further strengthens and formalizes this human/non-human melting pot
(Callon, 2004).
Algorithms can be considered like human actors or institutions in their effect on social decisions—as they can affect users’ preferences and behavior through technological means that are embodied in everyday digital environments. This networked view also emphasizes the multiple influences and forces at play that determine their operation. In fact,
the common metaphor referring to the human brain as a “black box,” emphasizing its
complexity and the non-transparent, abstract flow from inputs to outputs, is also applied to
algorithmic operation (Pasquale, 2015; Kitchin, 2017). The non-transparent and abstract
nature of algorithms starts with their coding, which usually occurs in the private sphere and
therefore is not transparent to the public. The creation of algorithms involves translating a
defined problem or task into operating principles, and then translating this set of principles
into an array of codes. While these processes are often imagined by the coders as natural and
abstract, the operationalization of the problems and tasks, as well of their pragmatic codified
solutions, relates to certain logic, culture, and political context (Kitchin, 2014).
The complexity and non-transparent nature of algorithms' operation also stems from
their operation in systems of other algorithms and actors. They are, according to Kitchin (2017), embedded within a wide “set of relations including potentially thousands of individuals, data sets, objects, apparatus,
elements, protocols, standards, laws, etc. that frame their development” (p. 20). Furthermore,
algorithms that are based on machine learning are constantly evolving through the interaction
and implementation of new data (Kitchin, 2014; Cheney-Lippold, 2011). They have a
constantly changing nature; even their creators cannot entirely track, explain, or predict how
their choice of design will be translated into operation. Given this complexity, it is not
surprising that end users often lack understanding of how algorithms operate (Rader & Gray,
2015).
The growing social role of algorithms, on the one hand, and the complexity of the multiple factors affecting their operation and
results, on the other, leads to calls for algorithmic transparency—to reveal the people who
control the algorithms, the criteria for algorithms' operation, information about the
characteristics of input data, the rate of error, and the possible sources of bias (Diakopoulos,
2015). Such calls for algorithmic transparency collide with corporations’ interests and desire to preserve trade secrets and avoid damaging the assets that contribute to their competitiveness (Diakopoulos, 2015; Diakopoulos & Koliska, 2016), as well as with concern
over overloading users with information they have no interest in (Diakopoulos & Koliska,
2016). As we will see, algorithms are being used as tools to personalize contents to which
users are exposed. In that, algorithms are in some ways fulfilling the role reserved for opinion leaders.
The need to personalize media content did not originate in the era of algorithms. In the early
days of the electronic media, the monological nature of mass media was recognized as a
problem that needed to be addressed. As Scannell (2000) notes, the BBC adopted the
principle that radio programming should not be aimed at a “mass” audience, but rather should
give listeners the feeling that it is directed at each individual. He refers to this strategy as the “for-anyone-as-someone” structure, with broadcasters allegedly speaking directly to the individual listener. Such modes were adopted in an attempt
to create a personalized and intimate atmosphere in mass media. Thus, Horton and Wohl (1956) point to the attempt in electronic mass media to create a para-social relationship—a simulated interpersonal relationship between media personas and their audiences. Although they acknowledged that these relationships are "one-sided, nondialectical and controlled by the performer" (Horton & Wohl, 1956, p. 215), they suggested they can be
imagined in interpersonal terms. The illusion of an intimate dialogue between audience and
persona was promoted by the talk style, appealing to listeners as if they were friends, as well
as through the perception of the “subjective camera,” where the lens became the eyes of the
audience. This “mediated quasi-interaction” blurred the monological nature of mass media,
creating for listeners and viewers a kind of friendship with media personas (Thompson, 1995). These strategies—termed by Scannell (2000) “for-anyone-as-someone structures” and by Horton and Wohl (1956) a para-social mode—indicate the perceived need to personalize mass media content.
If we consider the two-step flow theory against this background, the second step in
the process is part of personalizing and adapting general monological mass media production
for a relatively small group of individuals. While the first step in this flow imagines the
transmission of the same content to the mass of people, the effect of such universal flow is
assumed to be limited. There is a need for personal mediation of the content, which is
achieved through the interaction between opinion leaders and group members. The opinion
leader, who has the same social background as the group members, can customize the
universal message to suit the group members' habitus and perceptions. Such a leader knows
how to re-frame messages: what to emphasize and what to downplay so that the information
will fit group members. If we interpret the two-step flow in this way, using Scannell’s (2000)
idiom, the second step of the flow takes the “for-anyone" content and further strengthens the
frame of the "as-someone" structure. The two-step flow can be seen as a supplementary
social mechanism meant to compensate for the lack of interpersonal conversation in the mass
media era. Generic mass communication outlets are considered a starting point for discussion among relatively homogeneous people. These people conduct full intergroup and interpersonal dialogue
(distinct from the perceived monological one-sided flow of information in the mass media) in
the same way we imagine discussions with our family, friends, and colleagues.
In the digital era, however, personalization processes have new ramifications. Digital
media environments allow much easier and more effective tracking of users' behavior and
preferences than was possible in the mass media era (Sundar & Marathe, 2010). Algorithms
and personalization processes are often mentioned in the same breath. A major step in this direction is captured in the following description of the personalized web:
Personalization lets us hide annoying relatives on our Facebook feeds, list our
favorite microbloggers, and get updates from crucial RSS feeds. It means that
Google News might give pride of place to baseball, music, or left-wing politics
according to the reputation we establish. It means that Google Search orders its
results according to searches we've made before, through clicks collected by the
Algorithms, through recommendations and personalization, construct social realities. In this
way, they are similar to traditional mass communication. But algorithms are intended to
bypass or neutralize one of the main features of traditional media—that is, its mass character.
Algorithms personalize exposure to contents (Just & Latzer, 2016). Instead of allowing the
user to choose from a variety of contents, algorithms often curate the contents. Algorithms
respond in real time to evolving trends in consumer behavior, selections, online activities, and content consumption, as well as contextual signals such as location and social contacts (Just & Latzer, 2016). In this,
algorithmic operations are meant to resemble interpersonal interactions, in the same way that
someone who knows the person (an opinion leader, for example) would say to them in a
certain moment and a certain context, "Here, this might interest you!" Perhaps the most salient example of this trend can be seen in the abandonment by some social networks (e.g., Facebook) of a purely chronological feed in favor of ranking posts according to the expected interest users will have in them, where the time of posting is only one signal among many. In contrast to explicit personalization, which relies on users’ inputs, implicit techniques infer users' preferences
from the data collected through monitoring their digital activity. Among others, these
organizations are using data collected on social networks to apply collaborative filtering in
order to preserve their site's relevancy. Some suggest that the use of such tools has the
potential to restrict the variety of ideas users are exposed to and to encourage ideological
segregation, creating hegemonic "echo chambers" (Sunstein, 2009). Worries about the often
hidden effect of filtering and personalization on the heterogeneity and bias in public
discourse, along with general trends of transparency meant to recover public trust in
journalism (Diakopoulos & Koliska, 2016), have boosted calls for algorithmic transparency
in news production.
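The implicit, group-based personalization described above can be sketched as a minimal user-based collaborative filter. Everything here—user names, item labels, the simple overlap-count similarity—is a hypothetical illustration of the general technique, not any platform's actual method:

```python
# A schematic sketch of collaborative filtering: a user's preferences are
# inferred implicitly from users with similar behavior ("calculated peers").
# All names and data are invented for illustration.

def recommend(target, histories):
    """Suggest items consumed by behaviorally similar users."""
    seen = histories[target]
    scores = {}
    for user, items in histories.items():
        if user == target:
            continue
        similarity = len(seen & items)        # overlap in consumption history
        for item in items - seen:             # items the target has not seen
            scores[item] = scores.get(item, 0) + similarity
    return sorted(scores, key=scores.get, reverse=True)

histories = {
    "u1": {"news_a", "news_b"},
    "u2": {"news_a", "news_b", "news_c"},     # similar to u1: strong signal
    "u3": {"news_d"},                         # dissimilar: weak signal
}
print(recommend("u1", histories))  # "news_c" ranked first
```

Note that the "group" here is never visible to the user: "u1" is grouped with "u2" only because their histories overlap, which is precisely the calculated, implicit peer group discussed throughout this article.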
As explained above, the second step in the flow of mass communication, according to the
two-step flow theory, is to personalize the generic mass communication content. In a similar
vein, algorithms often personalize the results that users are exposed to. We can consider, for
example, a cultural product, such as a new book or movie: A review of this product might appear in the mass media and then be discussed within a social group. The opinion leader, who shares the same cultural habitus as other group members and
is acknowledged by them as a cultural authority, will share his or her opinion about the movie
or book. Using familiarity with the group members and their cultural tastes, the opinion
leader might encourage them to read the book or watch the movie. This kind of group
discussion takes a generic message, which appears on mass media, and customizes it to
individuals that form the group. The interpretation of the review in the mass media might
differ among different groups of the general population, according to age, socio-economic status, or political orientation. Such group mediation plays an important role in setting contemporary society’s agenda. In the digital era, the second step of
the communication flow can take place in social networks. In such frameworks, opinion
leaders still play an important role. Furthermore, there is no reason to believe that the
exposure to content on social networks would not percolate through interpersonal discussion
to wider circles than those that were initially exposed to the message, as was discussed in the
context of the original two-step flow. Taking this into account, we can return to the example
above of the dissemination of information or recommendation of the book or movie, but here
it is provided to the user through algorithmic processing (through active search or content
recommendation). On the surface, this manifests a direct streaming from the computerized
source to the user. As Bennett and Manheim (2006) observe, interpersonal and direct group
mediation of the content is missing in this process. The only players are the user and the
digital device. The speedy processing of algorithms and the intimacy of the electronic device being used strengthen the feeling of a one-step direct flow. Moreover, the algorithmic
operation, which stands behind the information curation, is not visible. Algorithms run in the
background. As Introna (2016) notes, they are "organizing our lives and opportunities
without our explicit participation, and seemingly outside of our direct control" (p. 18). In fact,
evidence shows that some users are unaware of the algorithmic presence and its role in
content curation on social networks (Eslami et al., 2015). Yet, it would be wrong to ignore
algorithmic mediation, along with its social and grouping implementations, just because the
process is rapid and invisible. The fraction of a second in which the algorithm runs behind the scenes has been described as a step of mediation: a technical step (de Vries, 2010). However,
it seems that this stage is far from having technological implications only.
While the two-step theory has been criticized for not taking into account unmediated
communication flow or different flow patterns such as horizontal flow in the digital era, there
is often no possibility of bypassing the algorithmic processing or changing the flow pattern.
As Napoli (2013) argues, the complexity of the digital fragmented environment results in a
total dependency of users on algorithms in the navigation process. In fact, navigation that is not algorithmically mediated has become nearly impossible.
The algorithmic mediated flow of communication usually occurs in the private sphere,
through individual use of a digital device. Yet, this interaction involves other algorithmic
identities. Algorithms construct reality through their choices. The perception that algorithm
results are personalized points to the calculation of a specific user’s character and behavior.
Yet, because algorithms predict behaviors in order "to tell the audience what interests them,
without relating to the actual interest of the audience" (Helberger, 2016, p. 193), they rely on
the calculation of digital behavior of relevant other users. Moreover, while the categorization
of algorithmic users is very dynamic, providing mobile "labels” that are maintained as long as they prove to be working (de Vries, 2010, p. 81), these categories, which need to be
translated and sold to advertisers, must correspond with traditional marketing categories.
Thus, as Cheney-Lippold (2011) argues, "online a category like gender is not determined by
one's genitalia or even physical appearance. Nor is it entirely self-selected. Rather, categories
of identity are being inferred upon individuals based on their web use" (p. 165). Gender
composition in algorithmic identity might change with time and data and present math-based
calculations of the two gender categories (Cheney-Lippold, 2011). Yet, in the end, such
algorithmic identities are used for marketing and will likely refer to traditional categories of
distinctions between men and women. In addition, certain algorithmic identities can be routed to entirely different algorithmic logics, as Seaver (2013) illustrates:
I recently interviewed a scientist who works for an online radio company. When I
asked him about the playlist algorithm that picks which song to play next, he
corrected me: there is not one playlisting algorithm, but five, and depending on
how a user interacts with the system, her radio station is assigned to one of the five
master algorithms, each of which uses a different logic to select music. (p. 5)
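The routing Seaver's interviewee describes—one of several logics chosen according to how a user interacts with the system—can be sketched schematically. The thresholds and strategy names below are invented for illustration; the actual system's five algorithms and assignment rules are not public:

```python
# A hedged, hypothetical sketch of multi-algorithm routing: a session is
# assigned to one of several selection logics based on interaction signals.

def pick_strategy(skips, likes):
    """Assign a listening session to one of several hypothetical logics."""
    if skips > 5:
        return "explore"    # restless listener: surface novel material
    if likes > 3:
        return "exploit"    # satisfied listener: stay close to known taste
    return "default"        # not enough signal: fall back to a generic logic

print(pick_strategy(skips=7, likes=0))  # → "explore"
```

The point of the sketch is not the particular rules but the architecture: the "algorithm" the user experiences is itself selected by another algorithm, reinforcing the black-box character discussed earlier.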
The application of the two-step theory to the algorithmic operation helps us to understand the
changing characteristics of the role and meaning of the relevant peer group in the
communication flow. The group has not disappeared in the algorithms' mediation, but it has changed. Following Zygmunt Bauman (2000), I argue here that the group in the algorithmic process is much more liquid in its nature than the one imagined in the original two-step
theory.
Bauman (2000) distinguishes between the modern and the late modern eras. He
describes modern social structures as “solid,” meaning that they maintain their characteristics
over time. However, in late modernity these structures are melting and changing their nature:
The solids whose turn has come to be thrown into the melting pot and which are in the process of being melted at the present time, the time of fluid modernity, are the bonds which interlock individual choices in collective projects and actions—the patterns of communication and coordination between individually conducted life policies on the one hand and political actions of human collectivities on the other.
(Bauman, 2000, p. 6)
Bauman’s metaphor regarding the liquidity of late modern social structures has been
applied to distinguish between modern structures and their undermining in late modern
culture: for example, in the concept of digital liquid consumption (Bardhi & Eckhardt, 2017); digital liquid language, a so-called lighter form of written language identified by strong oral influences in chats or texting (Soffer, 2013); and algorithmic culture in general (Introna, 2016). In a similar vein, while the groups of
family members, friends, and co-workers imagined in the two-step flow theory are
relatively solid and have clear characteristics in the modern social structure, the group’s
curation in algorithmic operation is much more fluid. The calculated peer group whose
digital behavior is used to personalize the results is constantly changing. Changes occur with updates to the algorithms’ codes. They are also constantly affected by the user’s own behavior: even small changes in user behavior will result in a shift in the algorithmic output and its identification
with a certain category. The same will occur in the case of changes in general social taste
or political orientation.
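This liquidity can be sketched as cluster reassignment: the user's calculated peer group is recomputed as each new behavioral signal arrives, so membership can shift from one interaction to the next. The centroids, features, and cluster names below are invented for illustration:

```python
# A minimal, hypothetical sketch of "liquid" grouping: nearest-centroid
# assignment over topic-frequency features, recomputed as behavior drifts.

CENTROIDS = {
    "cluster_sport":    {"sport": 1.0, "politics": 0.0},
    "cluster_politics": {"sport": 0.0, "politics": 1.0},
}

def assign(profile):
    """Return the cluster whose centroid is closest to the user's profile."""
    def dist(cluster):
        return sum((profile.get(k, 0.0) - v) ** 2
                   for k, v in CENTROIDS[cluster].items())
    return min(CENTROIDS, key=dist)

profile = {"sport": 0.9, "politics": 0.1}
print(assign(profile))   # → "cluster_sport": the user starts among sports fans

profile = {"sport": 0.2, "politics": 0.8}   # behavior drifts over time
print(assign(profile))   # → "cluster_politics": membership moves silently
```

Unlike the solid groups of the original theory, nothing here is visible to the user: the reassignment happens in the background, on every recalculation.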
This perception of groups as liquid and constantly changing entities fits well into the notion of cybernetic categorization. As Cheney-Lippold (2011) argues, "[I]n cybernetic categorization these groupings are always changeable,
following the user and suggesting new artifacts to be visited, seen, or consumed" (pp. 175-
176). Categories created by algorithms do not stem from any social rationale, but rather from
pragmatic calculations (de Vries, 2010). Moreover, users, who interact with the digital device
as “individuals”, might be unaware of the grouping and categorization process. The group is
not "there" as it was in the two-step flow. It likely cannot even be imagined, because the
common denominator is usually unknown. This is quite different from the visible, shared consumption of mass media content, which was seen as contributing to national imagination. In the algorithm era, the grouping influence is expressed
in the specific content that users are exposed to and—perhaps even more importantly—
through the absence of other content that seems unsuitable for the algorithmic identity of a
given user (Pariser, 2011). Because the algorithm prioritizes content that fits one's
algorithmic identity, Facebook users, for example, might find themselves being fed political
views that match their own views, while the views of the rival camp become invisible.
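The filter-bubble dynamic described above can be sketched in the same spirit, again as a hypothetical toy rather than Facebook's actual ranking logic: content is scored against a user's inferred leaning, and items from the rival camp simply never clear the threshold, so their absence is invisible to the user.

```python
# Minimal illustration (hypothetical, not any platform's actual feed logic):
# items aligned with a user's inferred political leaning score high and are
# shown; opposing items score low and are silently filtered out.

# Hypothetical "algorithmic identity": leaning on a -1 (camp A) .. +1 (camp B) scale
user_leaning = 0.8

posts = [
    {"id": "p1", "leaning": 0.9},   # matches the user's camp
    {"id": "p2", "leaning": 0.5},
    {"id": "p3", "leaning": -0.7},  # rival camp
]

def feed(posts, leaning, threshold=0.0):
    """Keep only posts whose alignment with the user's leaning clears the threshold."""
    scored = [(p["id"], p["leaning"] * leaning) for p in posts]
    return [pid for pid, score in sorted(scored, key=lambda s: s[1], reverse=True)
            if score > threshold]

print(feed(posts, user_leaning))  # → ['p1', 'p2']: the rival camp's post p3 never appears
```

The exclusion happens inside the scoring step, which is why the user experiences only the presence of matching views, never the filtering itself.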
It is also important to examine the authority of algorithms from the point of view of
the users. As discussed, in the two-step flow theory, opinion leaders are seen by group
members as having knowledge and expertise in the specific mass media content they are
mediating. The specialty of the opinion leaders, which varies across different fields, gives
them authority vis-à-vis the group members. The source of algorithmic authority differs from this, yet it is equally crucial in garnering people's trust in algorithmic curation.
Algorithms are far from being neutral: they classify people and nudge their behavior for
capital reasons (Kitchin, 2017). Yet, because algorithms involve machines and complex
calculations, they are often seen as objective and neutral (Introna, 2016). They enjoy the
objective aura of an automated, usually poorly understood, process that is operated without direct human intervention.
The following table crystallizes the similarities and differences between the original two-step flow theory and its adaptation in the algorithm era.
Table 1: Comparison of the original two-step flow theory and its adaptation in the algorithm era
Awareness of individuals to the second step: high in the original theory, as individuals take an active role in the interaction; potentially nonexistent in the algorithm era, as the algorithmic processing is hidden.
This comparison clarifies the important social and political role of algorithms in the current
communication flow, which had been reserved for human actors in the original two-step flow
structure. Human beings are not immune to manipulation. Yet, they have their own
independent beliefs and opinions. They have the capacity to be aware of fairness issues. The danger of handing mediatization into the hands of a machine is thus not only due to the algorithms' bias or their narrowing of users' exposure (Pariser, 2011; Sunstein, 2009). Algorithms as instruments can be manipulated: they can be used
against the interests of those who created and own them and benefit those who stand behind
the manipulation. In this way, algorithms can act as a Trojan horse at the heart of the communication process.
As Arnoldi (2016) argues, the central role of algorithms as decision makers in trading
markets has created new opportunities for manipulation. He explains that traders have always
tried to manipulate each other; but in the algorithmic era, manipulations that would be easily
traced by human agents are not noticed by the "dumb machines." Moreover, the rise of
algorithmic traders along with new techniques to trick them challenges the definitions of
market manipulations and raises new questions about liability, regulation, legal statutes, and
the role of regulative institutions, as well as the technical possibilities of creating more manipulation-resistant algorithms.
In the same way that we need to recognize the role of algorithms in trading markets,
we need to take into account their possible manipulation in the market of ideas and its
harmful effect on democracy. Here as well, algorithm manipulation continues a long legacy
of political deception and propaganda. What is new here is the central role played by
machines in making decisions about political discourse. And, similar to the case of market
manipulations, questions arise here about liability. This can be seen clearly in the 2016 US
presidential election and the reported Russian attempts to bias it (Shane, 2017). Of course, the
manipulators are the first to be blamed. But what is the responsibility of companies such as
Facebook and Twitter whose platforms were manipulated? Can they do a better job
protecting their systems from future manipulations? What about the general socio-political
responsibility of giving algorithms such a central role in the communication process and
decision making? Is a regulative intervention needed or possible? All these questions, which
are already on our daily political agenda, are sharpened when we compare the role of algorithms in the current communication flow to that of human opinion leaders in the original theory.
Discussion
It seems we should put some critical thought into the meaning of “algorithmic
personalization” as a popular and seemingly naturalized turn of phrase in social and scholarly discourse. Personalization algorithms are said to create value by providing users with information that matches their algorithmic identity and
preferences (Helberger, 2016). This phrase takes technology, along with the power
relationships behind it, and humanizes it using the terminology of interpersonal relationships.
Marcuse’s (1964) discussion on the functional use of language seems apt here: according to
Marcuse, the use of such language "closes the meaning of the thing, excluding other manners" of understanding it. The term personalization in a sense personifies the technology. This continues the trends and strategies of traditional
mass communication that aimed to give the transmitted contents an interpersonal flavor—the para-social relationship between the media “persona” and members of the audience (Horton & Wohl, 1956).
As Lievrouw (2009) points out, the two-step flow theory should be seen as an effort “to
bridge the gap between interpersonal and 'mass' communication” (p. 304). The two-step flow
framed the mass media as being woven within a network of interpersonal group relationships.
While mass communication organizations are the gatekeepers, selecting the information
provided to the public, opinion leaders act as secondary gatekeepers, selecting from the
generic content posted by the media those items that are relevant for their group and
providing interpretations of this content. In a similar spirit, algorithms are seen today to act as
gatekeepers, aggregating the content to which users are exposed (Just & Latzer, 2017). In
fact, as Singer (2014) argues, algorithms often play a secondary gatekeeping role: “In a
traditional media environment, items ignored by an editor were not visible to the public at all;
they did not make it past the journalistic gate. In today’s environment, published items
ignored by users will have made it successfully through that gate but may still fail to reach
more than a handful of readers” (p. 66). The roles of opinion leaders in the traditional media
era and algorithms as secondary gatekeepers point to the social role Web algorithms are
supposed to play. In the spirit of Horton and Wohl’s (1956) theory of the para-social
relationship between the individual viewer and the media persona, algorithms can be
described as acting as para-opinion leaders through the personalization process they enact.
They are supposed to "know" each user and to curate the relevant information for them.
As I have argued here, at the core of this algorithmic interaction stands a process that
relies on the user’s behavior, liquid social categorizations, geographical location, and other
relevant social information. The liquidity of algorithmic mediation, which in its essence is a
process of decision making regarding the social and political public agenda, makes it difficult to perceive. When we examine users' interactions with their digital devices through the lens of the two-step flow theory, the complexity of the
interaction and its social context are much clearer. Such a view also emphasizes the
continuity between the mass communication era and the digital era. As I argued in this study,
these two eras exhibit significant similarities, specifically between the opinion leaders' personal mediation of mass media contents in the former and the parallel mediation function performed by algorithms in the latter.
References
Arnoldi, J. (2016). Computer algorithms, market manipulation and the institutionalization of high frequency trading. Theory, Culture & Society, 33(1), 29-52. doi: 10.1177/0263276414566642.
Bennett, W. L., & Manheim, J. B. (2006). The one-step flow of communication. The
ANNALS of the American Academy of Political and Social Science, 608(1), 213-232.
Bergström, A., & Belfrage, M. J. (2018). News in social media. Digital Journalism, 6(5), 583-598.
Bravo, R. S. & Del Valle, M. E. (2017). Opinion leadership in parliamentary Twitter
Callon, M. (2004). The role of hybrid communities and socio-technical arrangements in the
participatory design. Journal of the Center for Information Studies, 5(3), 3-10.
Chaffee, S. H., & Metzger, M. J. (2001). The end of mass communication? Mass Communication & Society, 4(4), 365-379.
Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society, 28(6), 164-181. doi: 10.1177/0263276411424420.
Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. (2009). Introduction to algorithms (3rd ed.). Cambridge, MA: MIT Press.
De Vries, K. (2010). Identity, profiling algorithms and a world of ambient intelligence. Ethics and Information Technology, 12(1), 71-85.
Diakopoulos, N. (2015). Algorithmic accountability: Journalistic investigation of computational power structures. Digital Journalism, 3(3), 398-415. doi: 10.1080/21670811.2014.976411.
Diakopoulos, N., & Koliska, M. (2016). Algorithmic transparency in the news media. Digital Journalism, 5(7), 809-828.
Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., ... & Sandvig, C. (2015, April). I always assumed that I wasn't really that close to [her]: Reasoning about invisible algorithms in news feeds. In B. Begole & J. Kim (Eds.), Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (pp. 153-162). New York, NY: ACM.
Giddens, A. (1991). Modernity and self-identity: Self and society in the late modern age. Cambridge: Polity Press.
Helberger, N. (2016). Policy implications from algorithmic profiling and the changing
relationship between newsreaders and the media. Javnost–The Public, 23(2), 188-203.
Himelboim, I., Gleave, E., & Smith, M. (2009). Discussion catalysts in online political discussions: Content importers and conversation starters. Journal of Computer-Mediated Communication, 14(4), 771-789.
Horton, D., & Wohl, R. (1956). Mass communication and para-social interaction: Observations on intimacy at a distance. Psychiatry, 19(3), 215-229.
Introna, L. D. (2016). Algorithms, governance, and governmentality: On governing academic writing. Science, Technology, & Human Values, 41(1), 17-49. doi: 10.1177/0162243915587360.
Just, N., & Latzer, M. (2017). Governance by algorithms: reality construction by algorithmic
selection on the Internet. Media, Culture & Society, 39(2), 238–258. doi:
10.1177/0163443716643157
Karlsen, R. (2015). Followers are opinion leaders: The role of people in the flow of political communication on and beyond social networking sites. European Journal of Communication, 30(3), 301-318.
Katz, E., & Lazarsfeld, P. F. (1966). Personal influence: The part played by people in the flow of mass communications. New York: Free Press. (Original work published 1955)
Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. London: Sage.
Luckerson, V. (2015, July 5). Here's why Facebook won’t put your news feed in
chronological-order/
Napoli, P. M. (2010). Revisiting “mass communication” and the “work” of the audience in
the new media environment. Media, Culture & Society, 32(3), 505-516. doi:
10.1177/0163443710361658.
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York:
Penguin.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Cambridge, MA: Harvard University Press.
Rader, E., & Gray, R. (2015, April). Understanding user beliefs about algorithmic curation in the Facebook news feed. In B. Begole & J. Kim (Eds.), Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (pp. 173-182). New York, NY: ACM.
Saurwein, F., Just, N., & Latzer, M. (2015). Governance of algorithms: Options and limitations. info, 17(6), 35-49.
Shane, S. (2017, September 7). The fake Americans Russia created to influence the election.
New York Times. Retrieved from
https://www.nytimes.com/2017/09/07/us/politics/russia-facebook-twitter-
election.html.
Soffer, O. (2010). “Silent orality”: Toward a conceptualization of the digital oral features in
CMC and SMS texts. Communication Theory, 20(4), 387-404. doi: 10.1111/j.1468-
2885.2010.01368.x.
Soffer, O. (2013). The internet and national solidarity: A theoretical analysis. Communication Theory, 23(1), 48-66.
Sundar, S. S., & Marathe, S. S. (2010). Personalization versus customization: The importance of agency, privacy, and power usage. Human Communication Research, 36(3), 298-322. doi: 10.1111/j.1468-2958.2010.01377.x.
Thompson, J. B. (1995). The media and modernity: A social theory of the media. Stanford: Stanford University Press.
Thorson, K., & Wells, C. (2016). Curated flows: A framework for mapping media exposure in the digital age. Communication Theory, 26(3), 309-328.
Thurman, N., & Schifferes, S. (2012). The future of personalization at news websites: Lessons from a longitudinal study. Journalism Studies, 13(5-6), 775-790. doi: 10.1080/1461670X.2012.664341.
Turcotte, J., York, C., Irving, J., Scholl, R. M., & Pingree, R. J. (2015). News recommendations from social media opinion leaders: Effects on media trust and information seeking. Journal of Computer-Mediated Communication, 20(5), 520-535.
Vaccari, C., & Valeriani, A. (2013). Follow the leader! Direct and indirect flows of political
communication during the 2013 general election campaign. New Media & Society,
doi: 10.1177/1461444813511038.
Vu, H. N. N., & Gehrau, V. (2010). Agenda diffusion: An integrated model of agenda setting and interpersonal communication. Journalism & Mass Communication Quarterly, 87(1), 100-116.
Walsham, G. (1997). Actor-network theory and IS research: Current status and future prospects. In A. S. Lee, J. Liebenau, & J. I. DeGross (Eds.), Information systems and qualitative research (pp. 466-480). London: Chapman & Hall.
Weimann, G. (1982). On the importance of marginality: One more step into the two-step flow of communication. American Sociological Review, 47(6), 764-773. doi: 10.2307/2095212.
Winter, S., & Neubaum, G. (2016). Examining characteristics of opinion leaders in social media: A motivational approach. Social Media + Society, 2(3), 1-12. doi: 10.1177/2056305116665858.