
THE SOCIAL MEDIA PRIVACY MODEL
Abstract
Privacy has been defined as the selective control of information sharing, where control is key. For social media, however, exerting informational control has become more difficult for the individual user. In this theoretical article, I review how the term control figures in theorizing on privacy, and I develop an understanding of online privacy with communication as the core mechanism by which privacy is regulated. The results of this article's theoretical development are molded into a definition of privacy and the social media privacy model. The model is based on four propositions: Privacy in social media is interdependently perceived and valued. Thus, it cannot always be achieved through control. As an alternative, interpersonal communication is the primary mechanism by which to ensure social media privacy. Finally, trust and norms function as mechanisms that represent crystallized privacy communication. Further materials are available at https://osf.io/xhqjy/
Keywords: privacy, control, social media, affordances, communication, social media privacy model, definition of privacy
The Social Media Privacy Model: Privacy and Communication in the Light of Social Media Affordances
In historical and current theories about privacy, control has been perceived as an important defining term. The majority of privacy scholars understand control as the means by which to regulate and finally experience privacy (Altman, 1975; Burgoon, 1982; Petronio, 2002). The underlying assumption is that the more users can control access to their personal lives, or—more technically—to their data, the more privacy they experience. Likewise, the current understanding held by social media users is that they need control to achieve privacy and informational self-determination (Marwick & boyd, 2014). Majorities of 80% to 90% of U.S. Americans (Madden & Rainie, 2015) and Europeans (European Commission, 2015) say that it is important to them to be in control of determining who can obtain information about them and what information is collected about them (see also Sarikakis & Winter, 2017).
There is no question that users face decreasing informational control while communicating via social media. Due to their networked nature, social media applications do not allow users to control what friends, acquaintances, institutions, or companies do with the information, pictures, and stories that are shared online (Marwick & boyd, 2014). Further, social media communication takes place in ever larger parts of users' lives. The more recent applications and devices aggregate information and exert automatic control (Anderson, Rainie, & Duggan, 2014). As a reaction to the increasing amounts of data that are exchanged and the sociality of such data, 91% of users perceive that they have lost control over how their personal information is collected and used by friends, acquaintances, and colleagues (Quinn, 2014) and especially by companies and governments (Madden, 2014).
These two observations—the understanding of privacy as control on the one hand and the experience of decreasing control over information while using social media on the other—can be termed a control issue of privacy. In the remainder of this article, I will suggest an understanding of privacy that is adequate for social media use and the requirements emerging from this issue.
The Relationship of Privacy and Control
Privacy is a concept that has been considered and defined in very different disciplines, from descriptive, empirical, and normative perspectives (Sevignani, 2016; Trepte & Reinecke, 2011). In earlier days, privacy was considered a human right and identified as the “right to be let alone” (Warren & Brandeis, 1890, p. 75). Later and more specifically, privacy was defined as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” (Westin, 1967, p. 7) or as “the selective control of access to the self” (Altman, 1975, p. 24).
Informational control has only seldom been defined, but the most common definitions touch on either a static or a behavioral aspect of control. Informational control foremost means that owners of a certain piece of information have a choice over whether, when, and to what extent they will disclose or withhold personal information (Crowley, 2017; Tavani, 2007). Here, control is static, a question of more or less, yes or no; it can be understood as an option or an available mechanism. Control can then be exerted actively (e.g., restricting access to information, audience segregation, self-censorship, encryption), ambiguously (e.g., softening the truth, obfuscating information, or engaging in other forms of partial disclosure), or passively (e.g., unintentionally omitting information) (Crowley, 2017; Ochs & Büttner, 2018). In this rather behavioral understanding, informational control is executed and experienced by the individual person. In both perspectives, control is centered on the individual and individual decision making.
The majority of privacy theories are devoted to two—somewhat contradictory—paradigms: I will call the first paradigm “privacy as control,” because here, privacy and control are strongly connected, and the second paradigm “privacy and control,” because here, privacy and control are treated as separate constructs with conditional relationships. I will then suggest a third perspective that redefines the meaning and impact of control and the conditions under which control becomes relevant. This perspective will be summarized in the social media privacy model.
Paradigm 1: Privacy as Control
In the seminal work by Altman (1975) and the privacy regulation model of self-disclosure (Derlega, Metts, Petronio, & Margulis, 1993), control was set forth as the crucial mechanism of privacy. More recent conceptualizations have also referred to control as a precondition of privacy (Petronio, 2002). Even in their very first conceptualization of privacy, Warren and Brandeis (1890) referred to privacy as the right to control what others publish about oneself. In an overview of privacy theories, Smith, Dinev, and Xu (2011) investigated 448 publications on privacy. They found that—besides an understanding of privacy as a value—the cognate-based understanding of privacy as control has dominated the social sciences.
The vast majority of privacy scholars have referred to control as a dynamic behavior in the process of privacy regulation to grant or deny access. Altman (1975) suggested a process model with three steps: First, an individual assesses the desired level of privacy; then the individual eventually regulates privacy by controlling interpersonal boundaries; and finally, the individual again assesses the achieved level of privacy. In this flow model, the crucial role assigned to control becomes apparent. On the basis of this notion, Petronio (2002) articulated how control is the engine of privacy management. In her understanding, an individual jointly manages and coordinates privacy rules with others while interacting with them. Here again, control is not only the behavior through which privacy can be gained; it is also the means by which to measure the status quo of privacy, and in turn, it fosters the extent to which further privacy regulation is engaged in through the exertion of control.
Privacy scholars have also referred to the question of what is being controlled. Here, in particular, the control of access to boundaries and the control of the flow of an interaction were addressed as the topics or processes that need to be controlled (Johnson, 1974; Wolfe & Laufer, 1974). Further, control over stimuli that impinge upon a person was articulated as something that needs to be controlled (Wolfe & Laufer, 1974). Margulis (1974) explained that control refers to all matters being exchanged between individuals: “Privacy, as a whole or in part, represents the control of transactions between person(s) and other(s)…” (p. 77).
In some theories, control has been used almost interchangeably with privacy. For example, Derlega et al. (1993) stated that “…privacy represents control over the amount and kind of information exchange that persons have with one another” (p. 67). Burgoon (1982) differentiated between four dimensions of privacy, all of which refer to how much control an individual has: Physical privacy refers to whether and how much control an individual perceives having over physical boundaries. Social privacy refers to how much control an individual perceives having over the access of others to the person's environments. Psychological privacy refers to how much control an individual perceives having over emotional and cognitive input and output. Finally, informational privacy refers to how much control an individual perceives having over the use of personal data. In this conceptualization, the ability to exert control is the key to an individual's privacy perception and, in turn, regulation. Many empirical studies have addressed the relationship between control and privacy, but only a minority of studies have supported the notion that privacy behavior is related to informational control (Brandimarte, Acquisti, & Loewenstein, 2013).
In sum, studies based on this first paradigm have underscored the idea that individuals exert control to achieve privacy. In turn, privacy should be achieved if a certain level of control is successfully executed and maintained as the status quo. However, these conceptualizations of privacy suggest a linear relationship between privacy and control. They assume that “…the more one has control over this information exchange, the greater the amount of privacy one has in a social relationship” (Derlega et al., 1993, p. 67). Yet previous empirical research did not find such a linear relationship. Hence, there is a mismatch between the theoretical assumption that privacy and control are closely related on the one hand and the scarce empirical data supporting this notion on the other.
Paradigm 2: Privacy and Control
In social media, an individual person cannot easily execute control because personal information is exchanged among many parties and with a broad range of applications. Users experience social media as more confusing, demanding, and complex than face-to-face communication with regard to the control that they have over their personal information (Marwick & boyd, 2014; Quinn, 2014). Woo (2016) expressed this confusion while mimicking the presumed thoughts of a user: “Please do contact me and give me benefits, but I still do not want to fully give up my control (but I do not know how to have that control)” (p. 954). In other words, users want to take advantage of the networked nature of social media, are painfully aware of the deficits in control, but have not yet found solutions for how to embrace their needs for both gratification and informational control. This process of weighing privacy risks against social gratifications has also been investigated under the umbrella of the privacy calculus (Trepte et al., 2017).
To follow up on the sentiment that users wish to have informational control but that control seems to contradict the networked nature of social media, Moor (1997) and later Tavani (2007) reflected on the relationship between privacy and control. They argued that control and privacy should be seen as separate constructs and that privacy and control serve very different functions. With the Restricted Access/Limited Control (RALC) theory of privacy, these authors defined privacy in terms of the individual's protection from intrusion and information gathering by third parties. They argued that control in the information age is impossible and further that “We can have control but no privacy, and privacy but no control” (Tavani & Moor, 2001, p. 6). They suggested that privacy and control be separated such that privacy is a concept and a value that is defined by being protected from information access by others, whereas control is one mechanism that can be used to manage and justify privacy. Control may be exerted through choice, consent, or correction. In the flow of the exchange of digital information, people choose situations according to their communication goals, level of access, and emerging privacy needs (Trepte & Masur, 2017); then, privacy is maintained through the process of consent; and finally, corrections allow people to restore their privacy when it is lost or threatened. For the upcoming social media privacy model, I will refer to this notion that control is one mechanism among others, and I will explain that for all processes (i.e., choice, consent, correction), individuals have to get in touch with others and communicate their motives and aims.
With her theory of contextual integrity, Nissenbaum (2010) also addressed contextual requirements as boundary conditions, regardless of whether control is a functional mechanism. She suggested that two sets of theories be married: those referring to privacy as a constraint on access and those referring to privacy as a form of control. In her theory, Nissenbaum (2010) described control as one “transmission principle” (p. 145) that defines how information is exchanged. Other transmission principles are reciprocity and confidentiality. Control as a transmission principle is appropriate only if it fits the particular context, the subject that users are talking about, the type of information that is to be exchanged, and the actors they communicate with. From this point of view, there can be privacy without control in situations in which control is inappropriate or not available (Laufer & Wolfe, 1977; Slater, 2007).
Current privacy theories have pushed the idea of control as a dynamic attribute of the situation one crucial step further. According to Dienlin's (2014) privacy process model, individuals assess the controllability of the context and behavior. Masur (2019) added an analysis of what is being controlled by disentangling interpersonal factors (e.g., the relationship between interaction partners) and external factors (e.g., the architecture of a room) in his theory of situational privacy and self-disclosure. Depending on the situation, these interpersonal and external factors can be controlled to different degrees, and in turn, they can elicit differential levels of self-disclosure. In the remainder of this article, self-disclosure will be understood as “the intentional communication of information about the self to another person or group of people” (Masur, 2019, p. 70).
The notion that privacy and control are not necessarily connected has been supported by previous research (Saeri, Ogilvie, La Macchia, Smith, & Louis, 2014). For example, Zlatolas, Welzer, Heričko, and Hölbl (2015) demonstrated that privacy norms, policies, and awareness, but not privacy control, were related to the self-disclosures of N = 673 Slovenian Facebook users. In a U.S. sample of N = 249 Facebook users, Taneja, Vitrano, and Gengo (2014) found that perceived behavioral control and the intention to engage in privacy-related behavior were unrelated. Eastin et al. (2016) investigated how different variables predicted mobile commerce activity and found that control explained the smallest amount of variance; in particular, trust and attitude toward mobile commerce were more important predictors than control. In sum, individuals who perceived that they had control over their personal data did not necessarily feel they had more privacy and did not increasingly engage in self-disclosure. Further, trust and norms were identified as important alternative mechanisms of privacy (Brandimarte et al., 2013; Eastin et al., 2016; Nissenbaum, 2010; Zlatolas et al., 2015). I will refer to both findings in the social media privacy model.
The Interplay of Affordances, Control, and Privacy
The lack of a relation between privacy and control might hint that the interplay of the two variables is not linear but actually more complex (Laufer & Wolfe, 1977). The relation between control and privacy should become clearer if the social media boundary conditions that make control a functional mechanism in one situation but impractical in another are elucidated.
Social Media and Its Boundary Conditions for Privacy
Carr and Hayes (2015) defined social media as “…Internet-based channels that allow users to opportunistically interact and selectively self-present, either in real-time or asynchronously, with both broad and narrow audiences who derive value from user-generated content and the perception of interaction with others” (p. 50). They further pointed out that users' interactions will increasingly be influenced by social media affordances. In previous definitions, social media has also been characterized by its content, its users, and its infrastructure (Howard & Parks, 2012). The most prominent examples of social media are social network sites (e.g., Facebook, Instagram, LinkedIn, Google+), multimedia platforms (e.g., YouTube, SlideShare, SoundCloud), weblogs (e.g., personal diaries of mothers, scholars, or self-appointed or paid influencers), and microblogs (e.g., Twitter). In struggling to develop a definition of social media, scholars have pointed to the fact that social media channels are formally understood as methods of mass communication but that they primarily contain and perpetuate personal user interactions (Carr & Hayes, 2015; Papacharissi, 2010). In this sense, social media can be referred to as personal publics (Schmidt, 2014). As a consequence, users cannot always clearly draw the somewhat blurred lines between personal and public or between private and professional communication. They feel that contexts collapse and converge (Papacharissi, 2010; Vitak, 2012).
In sum, social media is characterized by the following boundary conditions: the content, its flow, and its further uses (Howard & Parks, 2012); the communication practices that users perceive as their options for exerting control or for achieving privacy with other means; and social media affordances (Carr & Hayes, 2015). In the following, I will analyze how the interplay of these boundary conditions is related to control and how it determines different privacy perceptions and behaviors. Tables 1 and 2 in the supplemental materials summarize this theoretical development.
Social Media Boundary Condition 1: Content, Its Flow and Uses
What exactly does an individual strive to control? The sooner we come to understand what individuals strive to control, the better we can evaluate whether control can be experienced in social media. According to most studies, personal information is the content that people strive to control in order to maintain their privacy in social media. Metzger (2004) referred to personal information as the content to be controlled. Quinn (2014) suggested different layers on which privacy can be maintained. On the “content layer,” users' experience of a lack of control leads them to limit the information they post or even to post false information. On the basis of their qualitative work, Sarikakis and Winter (2017) added that users do not differentiate between personal information and personal data. Instead, they define the degree of intimacy or privacy needed for a certain piece of information or data.
Besides personal information, the flow and use of the content need to be considered. Social media advocates specifically address where online information is forwarded, archived, and sold. They emphasize users' concerns about how little control they have over the flow and use of personal information (Marwick & boyd, 2014; Quinn, 2014; Tsay-Vogel, Shanahan, & Signorielli, 2018). This refers to the forms personal information takes, where it ends up, and how it is used. In the following, personal information, its flow, and its further uses will be considered as what users strive to control.
Social Media Boundary Condition 2: Practices of Control
Actively exerting control expresses informational self-determination, which implies having power over information and agency in decisions regarding this information. In turn, a loss of control would mean that other behavioral options are out of reach and that individuality (Buss, 2001), power, and agency are severely threatened (Brummett & Steuber, 2015). Control also comes along with risk avoidance: Users have identified the most important pieces of information that they want to control as the contents of their emails, the contents of their online chats, and their location (Cho, Lee, & Chung, 2010). As long as they have control over this information, they can avoid being harassed, bullied, financially exploited by companies, or surveilled by governmental institutions.
How is control executed and achieved? First, informational control can be identified as an individual's perception that he or she has a choice about whether to withhold or disclose information (Crowley, 2017; Johnson, 1974). Choice is the first step and determines whether control can be exerted and to what degree (Wolfe & Laufer, 1974). In the next step, when choice is available, it has to be put into action. Two active practices of control in social media are consent and correction (Tavani, 2007). Consent refers to the extent to which users agree that a certain piece of information will be passed along. Correction means that users are able to withdraw from this agreement. Whereas choice was identified long ago as a practice of control (Johnson, 1974), consent and correction were suggested as social media practices (Tavani, 2007). Further, informational control can also be put into practice through the selective sharing of information, self-censorship, audience segregation, and encryption (Ochs & Büttner, 2018). All of these options are directed by the individual and can be considered ego-centered. It will therefore be important to also find terms for interpersonal privacy regulation behaviors.
Social Media Boundary Condition 3: Affordances
Social media can be characterized by affordances. The term affordance, initially suggested by Gibson (1979/2014), means that the environmental characteristics of a certain entity are not static but are differently perceived and experienced and, as such, shaped by humans. In the case of social media, this understanding is more than suitable. Of course, the process of shaping or “furnishing” (Gibson, 1979/2014, p. 78) social media influences its further uses. For example, teenage users regulate their privacy through social steganography, an idiomatic language that is understood by their peers but not their parents, instead of managing their audiences by editing their friend lists and systematically blocking their parents or certain peers (boyd, 2014). Inventing and using this kind of idiomatic language might influence users' style of communication and interactions on social media.

Four affordances have repeatedly been shown to be particularly important for social media: anonymity, editability, association, and persistence (boyd, 2008; Evans, Pearce, Vitak, & Treem, 2017; Treem & Leonardi, 2012). The affordances of anonymity and editability allow users to exert control. By contrast, the affordances of association and persistence challenge users' ability to exert control. Both clusters of affordances—those enhancing (Table 1) as well as those challenging control (Table 2)—have different implications for how content is controlled, which practices of control are available, and how they affect privacy regulation in social media realms. In the following, I intertwine the results from previous research on privacy, the content and practices of control, and affordances.
Anonymity. The affordance of anonymity describes the idea that other social media agents such as other private people, institutions, or companies do not know the source or sender of a message (Evans et al., 2017). In social media use, being unknown to others and using social media anonymously is rare (Rainie, Kiesler, Kang, & Madden, 2013). Nevertheless, anonymity is still appreciated occasionally. For example, users commenting on other users' posts in self-help or political forums can decide to keep their interactions anonymous. In addition, users of dating websites might at least partly or temporarily use such sites anonymously (Ramirez, Bryant, Erin, Fleuriet, & Cole, 2015). Anonymity is not a question of “on” or “off” but is flexible and scalable (Evans et al., 2017). Anonymity has attracted enormous attention in research on computer-mediated communication (CMC; Joinson, 2001). Here, anonymity is specifically considered a means to control the access of certain individuals (Qian & Scott, 2007). Further, while being online anonymously, an individual can deliberately decide what information to share, with whom to share it, and what to withhold (Qian & Scott, 2007). In anonymous CMC, the receiver of a message is not provided with face-to-face cues, and as such, control lies in the hands of the sender (Ben-Ze'ev, 2003).
Control over content, its flow, and its uses is possible because, in a state of anonymity, all of these are disconnected from the user. Although full anonymity is not afforded by social media, U.S. American users occasionally keep their posts anonymous while using social media with the clear aim of exerting control (Rainie et al., 2013). For example, they disguise their location or delete cookies so that companies cannot identify them. An interview partner in Sarikakis and Winter's (2017) study said: “Well when I use fake names or email addresses and fake birthdates I think that's the only control you can try to have” (p. 9). Two aspects, though, might mitigate the perception of control. First, users know that they leave traces behind, and once they have posted content online—even if it was posted anonymously—they might be traceable because of other information they left online; second, anonymity is usually used only to some extent (e.g., by leaving one's name but not one's address), and users acknowledge that with partial anonymity, they experience only partial control and, in turn, only partial privacy (Rainie et al., 2013). What practices are available to users to exert control? First, being online anonymously can be considered a question of choice. Woo (2016) suggested that anonymity—or, by contrast, identifiability—is the most important issue in online privacy. He argued that on the social web, users should have “islands” of anonymity. He even encouraged people to lie and to have secrets with the aim of regaining control and autonomy. In an anonymous setting, however, consent and correction are somewhat disconnected from the user, as these are identity-related practices.
Anonymity can be an enactment of control by disguising or lying about one's identity or by leaving it unspecified for some applications and occasions. Also, people may leave the source of their own messages unknown. And, of course, these enactments of control might be applied when interacting with some users but not with others. In sum, the affordance of anonymity is related to informational control, which has also been demonstrated in empirical studies (Fox & McEwan, 2017).
In previous research on privacy, anonymity has played a crucial role. Here, it was even understood as a “type” of privacy among other types such as solitude, intimacy, or reserve (Pedersen, 1999; Westin, 1967). In sum, anonymity allows users to exert control and, in turn, it increases an individual's subjective experience of privacy by not being identifiable at all or by selectively presenting parts of one's own identity (Smith et al., 2011; Woo, 2016).
Editability. While using social media, users interact remotely in terms of time and space. This gives them the chance to edit their online communications before and after the communications are seen by others (Treem & Leonardi, 2012). Editability is an affordance that was previously addressed as important for CMC in the hyperpersonal model (Walther, 1996): Senders of online messages self-selectively present themselves by primarily transmitting cues and messages that they want others to see and that put them in a positive light. In addition, although editing one's identity is part of any social interaction, social media platforms offer more freedom in terms of what, how, and when these interactions are edited. Editing allows users to rehearse, change, package, or literally craft a message and, in turn, to rehearse, change, and craft their personal appearance.
Editability gives the message sender control over content and its flow and uses because users have the chance to ponder the consequences of their posts (Treem & Leonardi, 2012). Further, users may control the flow and further use of their content by articulating the lists of online friends that they connect with or by using a private or public profile (Ellison & boyd, 2013). The availability of control practices is highly supported by social media's affordance of editability (Fox & McEwan, 2017). Users have a choice to either intuitively post their thoughts or to pictorially represent their nonverbal cues. Editing can be considered an active enactment of control because users deliberately choose what to reveal to certain audiences or what to withhold (Crowley, 2017). Further, corrections of one's own posts and decisions are possible and can also be conceived as an enactment of control (Crowley, 2017).
Control over the flow of information might also foster subjective experiences of privacy. In privacy research, exerting control over the flow of an interaction was often understood synonymously with control or as a transmission principle that guaranteed privacy (Nissenbaum, 2010).
Association. Social media platforms are primarily used because they offer users the chance to connect with others and to stay in touch. Users have articulated that communication is their most important motive for using social network sites (Quinn, 2016). Consequently, the most important affordance of social media is the associations that are created or maintained between interaction partners (Ellison & boyd, 2013; Treem & Leonardi, 2012).
The affordance of association and the chance to exert control over content, its flow, and its uses seem strikingly incompatible. In their cross-sectional study of N = 875 Mechanical Turk workers, Fox and McEwan (2017) demonstrated that informational control and network association were negatively related. Control is exerted by an individual person, and as such, the individual person is at the center of, if not entirely responsible for, achieving privacy via control. This is clearly expressed in users' current understanding of control. For example, participants in Sarikakis and Winter's (2017) study identified the individual as the legitimate “controller” of privacy (p. 6). Also, boyd (2014) argued that exerting control with the aim of achieving privacy requires the individual's full consideration, power, knowledge, and skills. The only control practice available would be to completely withdraw (i.e., not to participate) and to accept the disadvantages that come along with such a decision. Alternatively, ambiguous and passive enactments of control might be used, but these have the same disadvantages.
In sum, control as an issue and a
concept takes the perspective of the
individual. In other
words, it is an “ego-centered need”
(Papacharissi, 2010, p. 144). By
contrast, association is an
interindividual concept. Very likely,
one person’s control ends at the point
where another
person’s control starts. Hence, as much
as the social web is an interdependent
realm involving
other parties, control is not the means
to ensure or achieve privacy. On the basis
of these
considerations—and this will be
important for the upcoming
propositions on social media
privacy—other means and mechanisms
to guarantee the subjective experience
of privacy have
become necessary: Users communicate
with each other to ensure privacy. And
further, if
communication is not possible, they
choose communication partners—
individuals, organizations,
institutions—that they can trust. Trust
has been shown to be crucial for
ensuring the perception of
privacy and subsequent online
disclosure (Metzger, 2004). Trust can
be established by personal
communication (Petronio, 2002) and by
norms that the individual user can rely
on (Saeri et al.,
2014). As such, in the upcoming
propositions and the social media
privacy model, trust,
communication, and norms are
conceptualized as the core mechanisms
to ensure privacy beyond
control.
Persistence. The affordance of
persistence addresses the durability of
online expressions
and content (boyd, 2014) and the idea
that after personal information is
published online, it is
automatically recorded and archived
and is consequently replicable (boyd,
2008). It means that
data remain accessible in the same
form over long periods of time and for
diverse and unforeseen
audiences (Evans et al., 2017; Treem &
Leonardi, 2012). Some authors have
emphasized the
positive outcomes that persistence may
have, namely, that it allows knowledge
to be sustained,
creates robust forms of
communication, establishes the growth
of content (Treem & Leonardi,
2012), and allows for authenticity and
longevity (boyd, 2008).
However, as persistence entails the enduring and potentially unlimited availability of online data, it also
expresses a loss of informational self-
determination. Persistence seems
incompatible with
control. It evokes the idea that control
over content, its flow, and uses is
impossible because once
personal information is posted online,
it is no longer under the sender’s
control. With regard to
control practices, social media users
have the choice to completely
withdraw from online
interactions, to not post their
information, and thus to avoid its
persistence. At the same time, this
would prevent them from receiving the
myriad benefits that come along with
data sharing and
thus no longer seems to be a genuine question of freedom of choice. Also, as soon as the choice is made to participate, the availability of other control practices such as consent and correction significantly decreases.
Finally, once given, consent decreases
a person’s chances to subsequently
correct previous online
communication. Users know that they
do not have control over how persistent
their data will be,
and this significantly decreases their
subjective experience of privacy
(Rainie et al., 2013). The
lack of control over personal
information due to the persistence of
all kinds of online information
can be considered one of the key issues
of online lives and the subjective
experience of privacy
today. To react to users’ needs to
foresee and understand persistence
(Rainie et al., 2013),
communication, trust, and norms seem to be adequate ways to ensure privacy.
The Social Media Privacy Model
Social media privacy is based on
interpersonal processes of mutual
disclosure and
communication (Altman, 1975;
Petronio, 2002). Further, it can be
considered a value that is co-
developed by engaging in
communication and that is expressed
by a shared perception
(Nissenbaum, 2010; Smith et al.,
2011). On the basis of these
considerations, I propose:
Proposition 1: Privacy is
interdependently perceived and valued.
In contrast to privacy, control is at the
center of the individual person. Control
is exerted
by the individual person, and it can be
considered to be ego-centered
(Papacharissi, 2010;
Sarikakis & Winter, 2017). Social
media platforms aim for
connectedness, interdependence, and
sociality and can be described as
social-centered (Ellison & boyd, 2013).
As a consequence,
social media privacy cannot be
sufficiently achieved by exerting
control. Other mechanisms are
necessary to ensure social media
privacy.
Proposition 2: Privacy cannot be
satisfactorily achieved by exerting
control in social media.
Instead of striving for control as an
end-game of privacy, the opposite is
necessary to
experience privacy in social media.
Users need to constantly communicate
with each other as
well as with institutions and companies
to ensure their privacy. They need to
engage in both
interpersonal and deliberative
communication processes.
Interpersonal communication is
understood as interactions between
users as well as interactions between
the individual user and
others who represent third parties such
as institutions and companies.
Deliberation is defined as
either informal or institutionalized
interaction among internet users (and
eventually
representatives of governments,
institutions, or companies), involving
rational-critical decision
making and the earnest aim to find a
solution (Burkhalter, Gastil, &
Kelshaw, 2002).
Proposition 3: Interpersonal communication is a mechanism by which social media privacy can be interdependently ensured and put into effect.
However, not all actions and steps of
online media use can be accompanied
by
communication processes. For many if
not most questions of privacy, people
can rely on past
experiences. Here, communication and
deliberation crystallize into a stable
relationship-based
result or even solution, i.e., trust (or
mistrust) and norms (or anomia). Trust
can be defined as an
anticipation and expectation of how a
person or institution will behave and as
such reduces
uncertainty (Waldman, 2018, p. 4).
Trust has been shown to ensure privacy
on the basis of
longstanding communication and
reliable bonds (Saeri et al., 2014). Trust
can be conceived as
both a crucial factor of influence in
decisions over self-disclosure and as a
result of
communication (Saeri et al., 2014). In
turn, mistrust—which has not yet been
addressed in
privacy research—is a menace to
privacy and as such should initiate
communication. In a
qualitative study on privacy
perceptions, Teutsch, Masur, and
Trepte (2018) demonstrated that
participants perceived that they had lost
control and had substituted trust for
control. One of the
interview partners said, “Well, privacy
is absolute trust between conversational
partners and
absolute, absolute certainty that the
subject of conversation will stay within
this sphere” (p. 7).
Eichenhofer (2019) suggested that the
“trust paradigm” should point toward a
more current
perspective on privacy regulation via
trust in contrast to privacy regulation
via control or self-
determination.
In the case of privacy regulation, both
social and legislative norms come into
play (Gusy,
2018; Spottswood & Hancock, 2017;
Utz & Krämer, 2009). Social norms are
understood as
social pressure to engage in a certain
kind of behavior and are established by
what others approve
of (injunctive norms) and what they
actually do (descriptive norms) (Saeri
et al., 2014). Although legislative norms are coined by jurisprudence and regulated by law (and not on the basis of observing others), they resemble social norms in that they prescribe a certain behavior and allow for sanctions in case this behavior is not shown. Previous research has shown that trust and norms are the keys to obtaining privacy
(Marwick & boyd, 2014; Quinn, 2014).
To establish trust
and norms, of course, communication
is necessary.
Proposition 4: Trust and norms
function as privacy mechanisms that
represent
crystallized privacy communication.
I suggest that control and
communication have found a new
balance in social media
communication: Control is losing
control and communication is gaining
power. In other words,
users do not solely rely on and strive
for control in the first place but strive to
communicate about
privacy to establish norms and trust
and even sometimes to regain control.
In fact, privacy’s
interdependence is expressed and put
forward by interpersonal
communication. This emerging
social turn in privacy theory is also
acknowledged in the definition of
privacy.
I define privacy by an individual’s assessments
of (a) the level of access to this
person in an interaction or relationship with
others (people, companies,
institutions) and (b) the availability of the
mechanisms of control,
interpersonal communication, trust, and norms
for shaping this level of access
through (c) self-disclosure as (almost intuitive)
behavioral privacy regulation
and (d) control, interpersonal communication,
and deliberation as means for
ensuring (a somewhat more elaborated)
regulation of privacy. In social media,
then, the availability of the mechanisms that
can be applied to ensure privacy
are crucially influenced by the content that is
being shared and the social media
affordances that determine how this content is
further used.
In the following, I will summarize the
theoretical rationale developed in this
article in the
chronology of a communicative
process. Further, I will show how the
individual’s privacy
assessments referred to in the four
propositions eventually lead to different
forms of privacy
regulation behaviors. The process is
illustrated in Figure 1. The following
steps are meant to
make the model accessible for
empirical investigation.
The first part of the flowchart refers to
the social media user’s subjective and
initial
assessment: All humans have
individual levels of access they
perceive as being more or less
adequate and comfortable. This
dispositional level of access is a
quantifiable dimension varying
between high and low levels and
expressing the individual’s
dispositional willingness to self-
disclose. In contrast, the
communication goal is rooted in the
situation and defines what is to be
achieved in this particular situation.
The individual’s communication goals
in social media are
manifold and important to consider
when assessing online privacy. They
can most likely be
understood as a qualitative scenario of
what the individual user wants and
needs to communicate
in a certain situation. Hence, the point of departure for any consideration about privacy is the set of more or less consciously asked questions: How do I feel, what do I need, and what is my goal in this particular situation?
The second part of the model refers to
the social media boundary conditions
that are
encountered. Here, content and
affordances dynamically interact (as
indicated with the
multiplication sign) with an
individual’s initial assessment. The
individual weighs the ideal level of access and his/her communication goals against social media boundary conditions by considering what content is shared and where, how it might flow from one user or institution to another, and how it might be used.
Social media content becomes dynamic
as it is displayed and
shared. Affordances represent this
dynamic and, together with the
individual’s dispositions and
goals, shape the available privacy
mechanisms: control, trust, norms, and
interpersonal
communication. Hence, users assess
whether they have a choice, whether
they can rely on trust or
norms, or whether they will (have to)
engage in interpersonal
communication.
The third part of the model refers to the
subjective experience of privacy. The
individual
experiences a certain level of access
that results from the individual’s goals
on the one hand and
the social media boundary conditions
and privacy mechanisms that are
applied to actively
regulate privacy on the other. This experience is here understood as a rather unfiltered accumulation of external stimuli and internal needs; it then results in a more elaborate reassessment, i.e., the privacy perception, which can be verbalized and is empirically accessible.
The privacy perception results in
different forms of privacy regulation
behaviors. First,
self-disclosure belongs to the most
intuitive regulation behaviors and
includes all information
intentionally shared (or not shared)
with others. And, for the case that the
privacy mechanism of
control is available and considered
adequate for a certain communication
goal, users exert control
actively and intentionally by restricting
access to information, audience
segregation, self-
censorship, encryption; or, rather
ambiguously by softening the truth,
obfuscating information.
When the privacy mechanisms do not allow for deliberate and somewhat egocentric privacy regulation (i.e., when individuals do not have at least partial control), other regulation behaviors
come into play. The individual might
engage in interpersonal communication
or even deliberation
to negotiate and interdependently
regulate privacy. Interpersonal
communication, deliberation,
and control are meta-level regulation
behaviors that come into play when
privacy behaviors are
not intuitive, when contexts collapse, or
when a certain situation demands
further elaboration
and/or communication.
The communication process shown in Figure 1 iterates in a constant flow of
assessments and re-assessments as soon
as the reactions of others lead to
changes in conditions or
when personal goals or needs change.
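Read as a process, the flow just described (initial assessment, boundary conditions, available mechanisms, and resulting regulation behavior) can be sketched as a toy decision procedure. The sketch below is an illustration only: every class name, field, threshold, and rule is a simplifying assumption introduced here for exposition, not part of the model itself, and the model's actual assessments are qualitative rather than numeric.

```python
# Schematic sketch of the social media privacy model's flow.
# All names and thresholds are illustrative assumptions, not the article's.
from dataclasses import dataclass

@dataclass
class InitialAssessment:
    dispositional_access: float  # dispositional willingness to self-disclose (0..1)
    communication_goal: str      # situational goal, e.g. "share vacation photos"

@dataclass
class BoundaryConditions:
    content_sensitivity: float    # how sensitive the shared content is (0..1)
    affordance_persistence: bool  # content persists and is replicable
    affordance_association: bool  # content circulates in a network of others

def available_mechanisms(b: BoundaryConditions) -> list[str]:
    """Derive which privacy mechanisms remain available under the boundary
    conditions; control drops out when persistence or association dominate."""
    mechanisms = ["interpersonal_communication", "trust", "norms"]
    if not (b.affordance_persistence or b.affordance_association):
        mechanisms.insert(0, "control")
    return mechanisms

def regulation_behavior(a: InitialAssessment, b: BoundaryConditions) -> str:
    """Pick a privacy regulation behavior: intuitive self-disclosure when
    control suffices, crystallized trust/norms or interpersonal
    communication otherwise."""
    mechanisms = available_mechanisms(b)
    if "control" in mechanisms and a.dispositional_access >= b.content_sensitivity:
        return "self_disclosure"  # intuitive, control-based regulation
    if "trust" in mechanisms and b.content_sensitivity < 0.5:
        return "rely_on_trust_and_norms"  # crystallized prior communication
    return "interpersonal_communication"  # negotiate privacy interdependently
```

For instance, under this sketch a persistent, networked post of low-sensitivity content falls back on trust and norms rather than control, mirroring Proposition 4.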
In what follows, I will discuss the
capabilities and
consequences of the model’s theoretical
propositions. What are the possible
effects and what are
the pitfalls and opportunities that will
occur if communication, trust, and
norms are substituted
for control?
Challenging the Social Media Privacy Model
Social media use is at the heart of
human communication and offers all of
its many merits
such as practicing freedom of speech or
reaching out to people in critical living
conditions and
providing them with social support. In
addition, social media communication
offers particular
benefits because it is ubiquitous and
independent of time and location. As
such, for example, it
allows people to communicate across
borders into the lands of friends, foes,
and fiends. All of
these merits of online communication
give its users enormous freedom. This
freedom is—and
this is again one of the major merits of
social media communication—often
independent of the
domestic or cultural situation of the
user. Freedom is historically closely
linked to privacy. In
mediaeval times, only those who had
land were free (Moore, 1984). And
only those who
possessed land had the freedom to
withdraw, grant, or restrict access to
their land. In turn, the
“unfree,” who did not have their own
land or possessions, did not have the
right to privacy. In this sense, freedom is the origin of privacy, both in legislation and in privacy's genealogy (Gusy, 2018).
For social media and online realms,
freedom and privacy are closely
connected. However,
the ideas of possessions and ownership
do not seem fully applicable anymore.
Borders and
possessions have become fuzzy
because, very often, data and personal
information are perceived
as shared goods as soon as they appear
online. Nissenbaum (2010)
summarized her work on
contextual integrity with the words:
“We have a right to privacy, but it is
neither a right to control
personal information nor a right to
have access to this information
restricted” (p. 231). She
conceded that for social media, privacy
is rather a value and a perception.
Borders and possessions have a
permanent nature. By contrast, values
and perception are
subject to interpretation and change.
This understanding of privacy as
subject to interpretation
and change makes it a matter of
communication. Eventually,
communication about privacy will
result in trust, and if successfully
shared in a society, in social and
legislative norms. However,
communication cannot be understood
as a long and painful process that will
finally show the
solution and lead us into the light of
privacy. Just the opposite is the case. In
social media,
communication about data sharing and
the use and flow of interpersonal data
is the solution itself.
Only due to ongoing and critical
assessment, reassessment, and dynamic
communication will we
have the chance to ensure privacy as
one of the most important values of
civilized societies.
Privacy’s interdependence is expressed
and put forward by interpersonal
communication.
And in fact, privacy is experiencing a
“social turn” in social media and
beyond (Helm &
Eichenhofer, 2019). This emerging
social turn in privacy theory is
acknowledged in the social
media privacy model. However, there
are downsides to a conception of
privacy as a
communicative process. First, not all
members of this communicative
process will have equal
chances of being heard. For example,
digital and literacy gaps have been
shown to crucially
influence participation in these
communication processes (Helsper,
2017).
Second, online information has become
a commodified good, and financial
interests are
stark (Sevignani, 2016). Purposeless
communication among friends is
increasingly exploited for
economic reasons (Seubert & Becker,
2019); and, companies do their best to
avoid interactions
between individual users who strive to
regulate their privacy. In turn, they try
to establish trust
through strong branding activities that
have been shown to override social
media users’ privacy
concerns and their interest in solving
privacy issues by actively
communicating and participating
(Boerman, Kruikemeier, & Zuiderveen
Borgesius, 2018; Li, 2011).
Third, as a consequence, the
requirements of communication and
trust demand a lot from
users. Control implies a settled state in
which the individual person can lie
back and stop thinking
about the flow of personal information
online. Communication, trust, and
norms, by contrast, are
subject to change and thus require
constant assessment and consideration.
Hence, information
control should also be considered with
regard to an individual’s ability to
exert control (Grimm &
Bräunlich, 2015). The user-centered perspective needs to be complemented and accompanied by a system-based perspective and respective interventions (Schäwel, 2020).
Fourth, this demanding process of
communication might also result in a
threat to the
ability to develop a self-determined
identity. Westin (1967) held that
boundary control means
identity control: “This deliberate
penetration of the individual’s
protective shell, his
psychological armor, would leave him
naked to ridicule and shame and would
put him under the
control of those who knew his secrets”
(p. 33). As a consequence, lost control
would mean
threats to identity development.
Finally, I embrace the demanding and
somewhat stressful nature of
communicating about
privacy in social media. In social
media, there is only limited control
over personal information.
In addition, the handling of this lack of
control is demanding and stressful. By
acknowledging
these two observations, users will recognize that they need to take
action, engage in
communication, and establish trust and
shared social and legislative norms. In
social media,
privacy is not a private affair. It is at
the center of communication. We are
out of control because
we have so much to share. Hence,
interpersonal communication, trust,
and norms are the three
most important mechanisms that
interdependently help to ensure social
media privacy.
References
Altman, I. (1975). The environment
and social behavior: Privacy, personal
space, territory,
crowding. Monterey, CA:
Brooks/Cole Publishing Company.
Anderson, J., Rainie, L., & Duggan,
M. (2014). Digital life in 2025.
Retrieved from
http://www.pewinternet.org/
2014/03/11/digital-life-in-2025/
Ben-Ze'ev, A. (2003). Privacy,
emotional closeness, and openness in
cyberspace. Computers
in Human Behavior, 19(4), 451–567.
https://doi.org/10.1016/S0747-
5632(02)00078-X
Boerman, S. C., Kruikemeier, S., &
Zuiderveen Borgesius, F. J. (2018).
Exploring
motivations for online privacy
protection behavior: Insights from
panel data.
Communication Research, 25.
https://doi.org/10.1177/009365021880
0915
boyd, d. (2008). Taken out of context.
American teen sociality in networked
publics (Doctoral
dissertation). University of California,
Berkeley.
boyd, d. (2014). It's complicated. The
social lives of networked teens. New
Haven, CT: Yale
University Press.
Brandimarte, L., Acquisti, A., &
Loewenstein, G. (2013). Misplaced
confidences: Privacy and
the control paradox. Social Psychological and Personality Science, 4(3), 340–347.
https://doi.org/
10.1177/1948550612455931
Brummett, E. A., & Steuber, K. R.
(2015). To reveal or conceal? Privacy
management
processes among interracial romantic
partners. Western Journal of
Communication, 79(1),
22–44.
https://doi.org/10.1080/10570314.201
4.943417
Burgoon, J. K. (1982). Privacy and
communication. Communication
Yearbook, 6(4), 206–
249. https://doi.org/10.1080/23808985
Burkhalter, S., Gastil, J., & Kelshaw,
T. (2002). A conceptual definition and
theoretical model
of public deliberation in small face-to-
face groups. Communication Theory,
12(4), 398–
422.
https://doi.org/10.1093/ct/12.4.398
Buss, A. (2001). Psychological
dimensions of the self. Thousand
Oaks, CA: Sage.
Carr, C. T., & Hayes, R. A. (2015).
Social media: defining, developing,
and divining. Atlantic
Journal of Communication, 23(1), 46–
65.
https://doi.org/10.1080/15456870.201
5.972282
Cho, H., Lee, J.-S., & Chung, S.
(2010). Optimistic bias about online
privacy risks: Testing
the moderating effects of perceived
controllability and prior experience.
Computers in
Human Behavior, 26, 987–995.
https://doi.org/10.1016/j.chb.2010.02.
012
Crowley, J. L. (2017). A framework
of relational information control: A
review and extension
of information control research in
interpersonal contexts.
Communication Theory, 27(2),
202–222.
https://doi.org/10.1111/comt.12115
Derlega, V. J., Metts, S., Petronio, S.,
& Margulis, S. T. (1993). Self-
disclosure. Sage series
on close relationships. Newbury Park,
CA: Sage Publications.
Dienlin, T. (2014). The privacy
process model. In S. Garnett, S. Halft,
M. Herz, & J. M.
Mönig (Eds.), Medien und Privatheit
[Media and privacy] (pp. 105–122).
Passau,
Germany: Karl Stutz.
Eastin, M. S., Brinson, N. H., Doorey,
A., & Wilcox, G. (2016). Living in a
big data world:
predicting mobile commerce activity
through privacy concerns. Computers
in Human
Behavior, 58, 214–220.
https://doi.org/10.1016/j.chb.2015.12.
050
Eichenhofer, J. (2019). e-Privacy -
Theorie und Dogmatik eines
europäischen
Privatheitsschutzes im Internet-
Zeitalter [Theoretical and doctrinal
foundations of a
European privacy protection
regulation in the internet age].
Bielefeld: University of
Bielefeld.
Ellison, N. B., & boyd, d. (2013).
Sociality through social network sites.
In W. H. Dutton
(Ed.), The Oxford handbook of
Internet studies (pp. 151–172).
Oxford, UK: Oxford
University Press.
European Commission. (2015).
Special Eurobarometer 431: Data
protection. Brussels, BE.
Retrieved from
http://ec.europa.eu/public_opinion/arc
hives/ebs/ebs_431_en.pdf
Evans, S. K., Pearce, K. E., Vitak, J.,
& Treem, J. W. (2017). Explicating
affordances: A
conceptual framework for
understanding affordances in
communication research. Journal
of Computer-Mediated
Communication, 22(1), 35–52.
https://doi.org/10.1111/jcc4.12180
Fox, J., & McEwan, B. (2017).
Distinguishing technologies for social
interaction: The
perceived social affordances of
communication channels scale.
Communication
Monographs, 84(3), 298–318.
https://doi.org/10.1080/03637751.201
7.1332418
Gibson, J. J. (2014). The ecological
approach to visual perception.
Psychology Press &
Routledge Classic Editions. Hoboken,
NJ: Taylor and Francis (Original work
published
1979).
Grimm, R., & Bräunlich, K. (2015).
Vertrauen und Privatheit [Trust and
privacy]. DuD
Datenschutz und Datensicherheit
[Data protection and data security], 5,
289–294.
Gusy, C. (2018). Datenschutz als
Privatheitsschutz oder Datenschutz
statt Privatheitsschutz?
[Data protection as privacy protection
or privacy protection as data
protection?].
Europäische Grundrechte Zeitschrift
[European Fundamental Rights
Journal], 45(9-12),
244–255.
Helm, P., & Eichenhofer, C. (2019).
Reflexionen zu einem social turn in
den privacy studies.
In C. Aldenhoff, L. Edeler, Hennig,
M., Kelsch, J., L. Raabe, & F. Sobala
(Eds.),
Digitalität und Privatheit [Digitality
and Privacy] (pp. 139–166). Bielefeld,
Germany:
transcript.
https://doi.org/10.14361/97838394466
14-009
Helsper, E. J. (2017). The social
relativity of digital exclusion:
Applying relative deprivation
theory to digital inequalities.
Communication Theory, 27(3), 223–
242.
https://doi.org/10.1111/comt.12110
Howard, P. N., & Parks, M. R. (2012).
Social media and political change:
Capacity,
constraint, and consequence. Journal
of Communication, 62(2), 359–362.
https://doi.org/10.1111/j.1460-
2466.2012.01626.x
Johnson, C. A. (1974). Privacy as
personal control. In S. T. Margulis
(Ed.), Man-environment
interactions: Evaluations and
applications (pp. 83–100).
Stroudsburg, PA: Dowden,
Hutchinson & Ross.
Joinson, A. N. (2001). Self-disclosure
in computer-mediated
communication: The role of self-
awareness and visual anonymity.
European Journal of Social
Psychology, 31(2), 177–192.
https://doi.org/10.1002/ejsp.36
Laufer, R. S. [R. S.], & Wolfe, M.
(1977). Privacy as a concept and a
social issue: A
multidimensional developmental
theory. Journal of Social Issues, 33(3),
22–42.
https://doi.org/10.1111/j.1540-
4560.1977.tb01880.x
Li, Y. (2011). Empirical studies on
online information privacy concerns:
Literature review
and an integrative framework.
Communications of the Association
for Information Systems,
28(1), 453–496. Retrieved from
http://aisel.aisnet.org/cais/vol28/iss1/2
8
Madden, M. (2014). Public
perceptions of privacy and security in
the post-Snowden era.
Retrieved from
http://www.pewinternet.org/2014/11/1
2/public-privacy-perceptions/
Madden, M., & Rainie, L. (2015).
Americans’ attitudes about privacy,
security and
surveillance. Retrieved from
http://www.pewinternet.org/2015/05/2
0/americans-attitudes-
about-privacy-security-and-
surveillance/
Marwick, A. E., & boyd, d. (2014).
Networked privacy. How teenagers
negotiate context in
social media. New Media & Society,
16(7), 1051–1067.
https://doi.org/
10.1177/1461444814543995
Masur, P. K. (2019). Situational
privacy and self-disclosure:
Communication processes in
online environments. Cham,
Switzerland: Springer International
Publishing.
Metzger, M. J. (2004). Privacy, trust,
and disclosure: Exploring barriers to
electronic
commerce. Journal of Computer-
Mediated Communication, 9(4).
https://doi.org/10.1111/j.1083-
6101.2004.tb00292.x
Moor, J. H. (1997). Towards a theory
of privacy in the information age.
ACM SIGCAS
Computers and Society, 27(3), 27–32.
https://doi.org/10.1145/270858.27086
6
Moore, B. (1984). Privacy: Studies in
social and cultural history. Armonk,
N.Y.: M.E.
Sharpe.
Nissenbaum, H. (2010). Privacy in
context: Technology, policy, and the
integrity of social
life. Palo Alto, CA: Stanford
University Press.
Ochs, C., & Büttner, B. (2018). Das
Internet als "Sauerstoff" und
"Bedrohung" [The internet
as oxygen and menace]. In M.
Friedewald (Ed.), DuD-Fachbeiträge.
Privatheit und
selbstbestimmtes Leben in der
digitalen Welt [Privacy and a self-
determined life in a
digital world] (pp. 33–80).
Wiesbaden, Germany: Springer
Vieweg.
Papacharissi, Z. (2010). A private
sphere: Democracy in a digital age.
Cambridge: Polity
Press.
Pedersen, D. M. (1999). Model for
types of privacy by privacy functions.
Journal of
Environmental Psychology, 19, 397–
405.
https://doi.org/10.1006/jevp.1999.014
0
Petronio, S. (2002). Boundaries of
privacy. Albany, NY: State University
of New York Press.
Qian, H., & Scott, C. R. (2007).
Anonymity and self-disclosure on
weblogs. Journal of
Computer-Mediated Communication,
12(4), 1428-1451.
https://doi.org/10.1111/j.1083-
6101.2007.00380.x
Quinn, K. (2014). An ecological
approach to privacy: “Doing” online
privacy at midlife.
Journal of Broadcasting & Electronic
Media, 58(4), 562–580.
https://doi.org/
10.1080/08838151.2014.966357
Quinn, K. (2016). Why we share: A
uses and gratifications approach to
privacy regulation in
social media use. Journal of
Broadcasting & Electronic Media,
60(1), 61–86.
https://doi.org/
10.1080/08838151.2015.1127245
Rainie, L., Kiesler, S., Kang, R., &
Madden, M. (2013). Anonymity,
privacy, and security
Online. Retrieved from
http://www.pewinternet.org/2013/09/0
5/anonymity-privacy-and-
security-online/
Ramirez, A., Bryant, S., Erin, M.,
Fleuriet, C., & Cole, M. (2015). When
online dating
partners meet offline: The effect of
modality switching on relational
communication
between online daters. Journal of
Computer-Mediated Communication,
20(1), 99–114.
https://doi.org/10.1111/jcc4.12101
Saeri, A. K., Ogilvie, C., La Macchia,
S. T., Smith, J. R., & Louis, W. R.
(2014). Predicting
Facebook users' online privacy
protection: Risk, trust, norm focus
theory, and the theory of
planned behavior. The Journal of
Social Psychology, 154(4), 352–369.
https://doi.org/
10.1080/00224545.2014.914881
Sarikakis, K., & Winter, L. (2017).
Social media users’ legal
consciousness about privacy.
Social Media + Society, 3(1), 1-14.
https://doi.org/10.1177/205630511769
5325
Schäwel, J. (2020). How to raise
users’ awareness of online privacy.
Duisburg, Germany:
University of Duisburg-Essen.
Schmidt, J.-H. (2014). Twitter and the
rise of personal publics. In K. Weller,
A. Bruns, J.
Burgess, M. Mahrt, & C. Puschmann
(Eds.), Digital formations: Vol. 89.
Twitter and
society (pp. 3–14). New York: Peter
Lang.
Seubert, S., & Becker, C. (2019). The
culture industry revisited:
Sociophilosophical
reflections on ‘privacy’ in the digital
age. Philosophy & Social Criticism,
45(8), 930–947.
https://doi.org/
10.1177/0191453719849719
Sevignani, S. (2016). Privacy and
capitalism in the age of social media.
Routledge research
in information technology and society:
Vol. 18. New York, NY: Routledge.
Slater, M. D. (2007). Reinforcing
spirals: The mutual influence of media
selectivity and
media effects and their impact on
individual behavior and social
identity. Communication
Theory, 17(3), 281–303.
https://doi.org/10.1111/j.1468-
2885.2007.00296.x
Smith, H. J., Dinev, T., & Xu, H.
(2011). Information privacy research:
an interdisciplinary
review. MIS Quarterly, 35(4), 989–
1016.
Spottswood, E. L., & Hancock, J. T.
(2017). Should I share that?
Prompting social norms that
influence privacy behaviors on a
social networking site. Journal of
Computer-Mediated
Communication, 22(2), 26.
https://doi.org/10.1111/jcc4.12182
Taneja, A., Vitrano, J., & Gengo, N. J.
(2014). Rationality-based beliefs
affecting individual’s
attitude and intention to use privacy
controls on facebook: An empirical
investigation.
Computers in Human Behavior, 38,
159–173.
https://doi.org/10.1016/j.chb.2014.05.
027
Tavani, H. T. (2007). Philosophical
theories of privacy: Implications for
an adequate online
privacy policy. Metaphilosophy,
38(1), 1–22.
https://doi.org/10.1111/j.1467-
9973.2006.00474.x
Tavani, H. T., & Moor, J. H. (2001).
Privacy protection, control of
information, and privacy-
enhancing technologies. ACM
SIGCAS Computers and Society,
31(1), 6–11.
https://doi.org/
10.1145/572277.572278
Teutsch, D., Masur, P. K., & Trepte, S. (2018). Privacy in mediated and nonmediated interpersonal communication: How subjective concepts and situational perceptions influence behaviors. Social Media + Society, 4(2), 1–14. https://doi.org/10.1177/2056305118767134
Treem, J. W., & Leonardi, P. M. (2012). Social media use in organizations: Exploring the affordances of visibility, editability, persistence, and association. Communication Yearbook, 36, 143–189. https://doi.org/10.1080/23808985.2013.11679130
Trepte, S., & Masur, P. K. (2017). Need for privacy. In V. Zeigler-Hill & T. K. Shackelford (Eds.), Encyclopedia of personality and individual differences. London, UK: Springer. https://doi.org/10.1007/978-3-319-28099-8_540-1
Trepte, S., & Reinecke, L. (Eds.). (2011). Privacy online: Perspectives on privacy and self-disclosure in the social web. Berlin, Germany: Springer.
Trepte, S., Reinecke, L., Ellison, N. B., Quiring, O., Yao, M. Z., & Ziegele, M. (2017). A cross-cultural perspective on the privacy calculus. Social Media + Society, 3(1), 1–13. https://doi.org/10.1177/2056305116688035
Tsay-Vogel, M., Shanahan, J., & Signorielli, N. (2018). Social media cultivating perceptions of privacy: A 5-year analysis of privacy attitudes and self-disclosure behaviors among Facebook users. New Media & Society, 20(1), 141–161. https://doi.org/10.1177/1461444816660731
Utz, S., & Krämer, N. (2009). The privacy paradox on social network sites revisited: The role of individual characteristics and group norms. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 3(2). Retrieved from http://cyberpsychology.eu/view.php?cisloclanku=2009111001&article=2
Vitak, J. (2012). The impact of context collapse and privacy on social network site disclosures. Journal of Broadcasting & Electronic Media, 56(4), 451–470. https://doi.org/10.1080/08838151.2012.732140
Waldman, A. E. (2018). Privacy as trust. Cambridge, UK: Cambridge University Press. https://doi.org/10.1017/9781316888667
Walther, J. B. (1996). Computer-mediated communication: Impersonal, interpersonal, and hyperpersonal interaction. Communication Research, 23(1), 3–43. https://doi.org/10.1177/009365096023001001
Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220.
Westin, A. F. (1967). Privacy and freedom. New York, NY: Atheneum.
Wolfe, M., & Laufer, R. (1974). The concept of privacy in childhood and adolescence. In S. T. Margulis (Ed.), Man-environment interactions: Evaluations and applications (pp. 29–54). Stroudsburg, PA: Dowden, Hutchinson & Ross.
There is no question that users face
decreasing informational control while
communicating via social media. Due
to their networked nature, social media
applications do not
allow users to control what friends,
acquaintances, institutions, or
companies do with the
information, pictures, and stories that
are shared online (Marwick & boyd,
2014). Further, social
media communication takes place in ever larger parts of users’ lives, and the most current applications and devices aggregate information and exert automatic control (Anderson, Rainie, & Duggan, 2014). As a reaction
to the increasing amounts of data that
are exchanged and
the sociality of such data, 91% of users
perceive that they have lost control
over how their
personal information is collected and
used by friends, acquaintances, and
colleagues (Quinn,
2014) and especially by companies and
governments (Madden, 2014).
These two observations—the
understanding of privacy as control on
the one hand and the
experience of decreasing control over
information while using social media
on the other—can be
termed a control issue of privacy. In the
remainder of this article, I will suggest
an understanding
of privacy that is adequate for social
media use and the requirements
emerging from this issue.
The Relationship of Privacy and
Control
Privacy is a concept that has been
considered and defined in very
different disciplines,
from descriptive, empirical, and
normative perspectives (Sevignani,
2016; Trepte & Reinecke,
2011). In earlier days, privacy was
considered a human right and identified
as the “right to be let
alone” (Warren & Brandeis, 1890, p. 193). Later and more specifically,
privacy was defined as
“the claim of individuals, groups, or
institutions to determine for themselves
when, how, and to
what extent information about them is
communicated to others” (Westin,
1967, p. 7) or “the
selective control of access to the self”
(Altman, 1975, p. 24).
Informational control has only seldom
been defined, but the most common
definitions
touch on either a static or a behavioral
aspect of control: Informational control
foremost means that
owners of a certain piece of
information have a choice over
whether, when, and to what extent
they will disclose or withhold personal
information (Crowley, 2017; Tavani,
2007). Here control
is static, a question of more or less, yes
or no. It can be understood as an option
or an available
mechanism. Then, control can be
exerted actively (e.g., restricting access
to information,
audience segregation, self-censorship,
encryption), ambiguously (e.g.,
softening the truth,
obfuscating information, or engaging
in other forms of partial disclosure), or
passively (e.g., unintentionally omitting information)
(Crowley, 2017; Ochs & Büttner,
2018). In this rather
behavioral understanding,
informational control is executed and
experienced by the individual
person. In both perspectives, control is
centered around the individual and
individual decision
making.
The majority of privacy theories are
devoted to two—somewhat
contradictory—
paradigms: I will call the first
paradigm “privacy as control,” because
here, privacy and control
are strongly connected, and the second
paradigm “privacy and control,”
because here, privacy
and control are treated as separate
constructs with conditional
relationships. I will then suggest a
third perspective that redefines the
meaning and impact of control and the
conditions among
which control becomes relevant. This
perspective will be summarized in the
social media privacy
model.
Paradigm 1: Privacy as Control
In the seminal work by Altman (1975)
and the privacy regulation model of
self-disclosure
(Derlega, Metts, Petronio, & Margulis,
1993), control was set forth as the
crucial mechanism of
privacy. More recent
conceptualizations have also referred to
control as a precondition of privacy
(Petronio, 2002). Even in their very
first conceptualizations of privacy,
Warren and Brandeis
(1890) referred to privacy as the right
to control what others publish about
oneself. In an
overview of privacy theories, Smith,
Dinev, and Xu (2011) investigated 448
publications on
privacy. They found that—besides an
understanding of privacy as a value—
the cognate-based
understanding of privacy as control has
dominated the social sciences.
The vast majority of privacy scholars
have referred to control as a dynamic
behavior in
the process of privacy regulation to
grant access or to deny access. Altman
(1975) suggested a
process model with three steps: First,
an individual assesses the desired level
of privacy; then the
individual eventually regulates privacy
by controlling interpersonal
boundaries; and then, the
individual again assesses the achieved
level of privacy. In this flow model, the crucial role assigned to control becomes apparent. On the basis of this notion,
Petronio (2002) articulated
how control is the engine of privacy
management. In her understanding, an
individual jointly
manages and coordinates rules with
others while interacting with them.
Here again, control is not
only the behavior through which
privacy can be gained, but control is
also the means by which to
measure the status quo of privacy, and in turn, it determines the extent to which privacy regulation is further pursued through the exertion of control.
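Altman’s regulation cycle is, at its core, a feedback loop: the individual compares the desired with the achieved level of privacy and opens or closes boundaries accordingly. A minimal sketch of that loop, assuming an illustrative 0-to-1 scale and adjustment step that are not part of Altman’s model:

```python
def regulate_privacy(desired: float, achieved: float, openness: float,
                     step: float = 0.1) -> float:
    """One illustrative iteration of Altman's (1975) regulation cycle.

    All quantities are on an assumed 0-1 scale: too little privacy
    leads to closing boundaries (less openness); too much privacy
    (social isolation) leads to opening them.
    """
    if achieved < desired:    # crowding: boundaries are closed further
        return max(0.0, openness - step)
    if achieved > desired:    # isolation: boundaries are opened
        return min(1.0, openness + step)
    return openness           # desired level reached: no adjustment
```

The point of the sketch is only that control is here dynamic and corrective rather than a fixed state, which is the reading the subsequent theories build on.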
Privacy scholars have also referred to
the question of what is being
controlled. Here, in
particular, the control of access to
boundaries and the control of the flow
of an interaction were
addressed as the topics or processes
that needed to be controlled (Johnson,
1974; Wolfe &
Laufer, 1974). Further, control over
stimuli that impinge upon a person
were articulated as things
that need to be controlled (Wolfe &
Laufer, 1974). Margulis (1974)
explained that control refers
to all matters being exchanged between
individuals: “Privacy, as a whole or in
part, represents the
control of transactions between
person(s) and other(s)…” (p. 77).
In some theories, control has been used
almost interchangeably with privacy.
For
example, Derlega et al. (1993) stated
that “…privacy represents control over
the amount and kind
of information exchange that persons
have with one another” (p. 67). Then,
Burgoon (1982)
differentiated between four dimensions
of privacy, all of which refer to how
much control an
individual has: Possessing physical
privacy refers to whether and how
much control an individual
perceives to have over physical
boundaries. Social privacy refers to
how much control an
individual perceives to have over the
access of others to the person’s
environments.
Psychological privacy refers to how
much control an individual perceives to
have over emotional
and cognitive input and output. Finally,
informational privacy refers to how
much control an
individual perceives to have over the
use of personal data. In this
conceptualization, the ability to
exert control is the key to an
individual’s privacy perception and, in
turn, regulation. Many
empirical studies have addressed the
relationship between control and
privacy, but only a
minority of studies have supported the
notion that privacy behavior is related
to informational
control (Brandimarte, Acquisti, &
Loewenstein, 2013).
In sum, studies that have been based on
this first paradigm have underscored
the idea that
individuals exert control to achieve
privacy. In turn, privacy should be
achieved if a certain level
of control is successfully executed and
maintained as the status quo. However,
these
conceptualizations of privacy suggest a
linear relationship between privacy and
control. They
assume that “…the more one has
control over this information exchange,
the greater the amount
of privacy one has in a social
relationship” (Derlega et al., 1993, p.
67). However, previous
empirical research did not find a linear
relationship between privacy and
control. Hence, there is
a mismatch between the theoretical
assumption that privacy and control are
closely related on the
one hand and the scarce empirical support for this notion on the other.
Paradigm 2: Privacy and Control
In social media, an individual person
cannot easily execute control because
personal
information is exchanged between
many parties and with a broad range of
applications. Users
experience social media as more
confusing, demanding, and complex
with regard to the control
that they have over their personal
information than face-to-face
communication (Marwick
& boyd, 2014; Quinn, 2014). Woo
(2016) expressed this confusion while
mimicking the
presumed thoughts of a user: “Please
do contact me and give me benefits, but
I still do not want
to fully give up my control (but I do
not know how to have that control)” (p.
954). In other
words, users want to take advantage of
the networked nature of social media,
are painfully aware
of the deficits in control, but have not
yet found ways to reconcile their needs for both gratification and informational control.
This process of weighing privacy risks
and social
gratifications has also been
investigated under the umbrella of the
privacy calculus (Trepte et al.,
2017).
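The privacy calculus named here is, in effect, a cost-benefit comparison. The following sketch makes that reading concrete; the additive form, the risk weight, and the example values are illustrative assumptions, not parameters reported by Trepte et al. (2017):

```python
def privacy_calculus(gratifications: list[float], risks: list[float],
                     risk_weight: float = 1.0) -> bool:
    """Illustrative privacy calculus: disclose when perceived social
    gratifications outweigh the (weighted) perceived privacy risks."""
    return sum(gratifications) > risk_weight * sum(risks)

# A user anticipating high gratification and moderate risk discloses;
# weighting risks more heavily can reverse the decision.
privacy_calculus([0.9, 0.7], [0.5, 0.4])                   # True
privacy_calculus([0.9, 0.7], [0.5, 0.4], risk_weight=2.0)  # False
```

Raising `risk_weight` models a more privacy-concerned user for whom the same anticipated gratifications no longer outweigh the perceived risks.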
To follow up on the sentiment that
users wish to have informational
control but that
control seems to contradict the
networked nature of social media,
Moor (1997) and later Tavani
(2007) reflected on the relationship
between privacy and control. They
argued that control and
privacy should be seen as separate
constructs and that privacy and control
serve very different
functions. With the Restricted
Access/Limited Control (RALC) theory
of privacy, these authors
defined privacy in terms of the
individual’s protection from intrusion
and information gathering
by third parties. They argued that
control in the information age is
impossible and further that
“We can have control but no privacy,
and privacy but no control” (Tavani &
Moor, 2001, p. 6).
They suggested that privacy and control
should be separated such that privacy is
a concept and a
value that is defined by being protected
from information access by others,
whereas control is one
mechanism that can be used to manage
and justify privacy. Control may be
exerted through
choice, consent, or correction. In the
flow of the exchange of digital
information, people choose
situations according to their
communication goals, level of access,
and emerging privacy needs
(Trepte & Masur, 2017); then, privacy
is maintained through the processes of
consent, and
finally, corrections allow people to
restore their privacy when it gets lost or
threatened. For the
upcoming social media privacy model,
I will refer to this notion that control is
one mechanism
among others, and I will explain that
for all processes (i.e., choice, consent,
correction),
individuals have to get in touch with
others and communicate their motives
and aims.
With her theory of contextual integrity,
Nissenbaum (2010) also addressed the
contextual
requirements as boundary conditions,
regardless of whether control is a
functional mechanism or
not. She suggested that the two sets of
theories be married: those referring to
privacy as a
constraint on access and those referring
to privacy as a form of control. In her
theory of
contextual integrity, Nissenbaum
(2010) described control as one
“transmission principle” (p.
145) that defines how information is
exchanged. Other transmission
principles are reciprocity and
confidentiality. Control as a
transmission principle is appropriate
only if it fits into the particular
context, the subject that users are
talking about, the type of information
that is to be exchanged,
and the actors they communicate with.
From this point of view, there can be
privacy without
control in situations in which control is
inappropriate or not available (Laufer
& Wolfe, 1977;
Slater, 2007).
Current privacy theories pushed the
idea of control as a dynamic attribute of
the situation
one crucial step further. According to
Dienlin’s (2014) privacy process
model, individuals assess
the controllability of the context and
behavior. Masur (2019) adds an
analysis of what is being
controlled by entangling interpersonal
(e.g., the relationship between
interaction partners) and
external factors (e.g., the architecture of
a room) in his theory of situational
privacy and self-
disclosure. Depending on the situation,
these interpersonal and external factors
can be controlled
to different degrees, and in turn, they
can elicit differential levels of self-
disclosure. Self-
disclosure will be understood as “the
intentional communication of
information about the self to
another person or group of people”
(Masur, 2019, p. 70) in the remainder of
this article.
The notion that privacy and control are
not necessarily connected has been
supported by
previous research (Saeri, Ogilvie, La
Macchia, Smith, & Louis, 2014). For
example, Zlatolas,
Welzer, Heričko, and Hölbl (2015)
demonstrated that privacy norms,
policies, and awareness but
not privacy control were related to the
self-disclosures of N = 673 Slovenian
Facebook users. In
a U.S. sample of N = 249 Facebook
users, Taneja, Vitrano, and Gengo
(2014) found that
perceived behavioral control and the
intention to engage in privacy-related
behavior were
unrelated. Eastin et al. (2016)
investigated how different variables
predicted mobile commerce
activity and found that control explained the smallest amount of variance. In
particular, trust and attitude toward
mobile commerce were more important
predictors than
control. In sum, individuals who
perceived that they had control over
their personal data did not
necessarily feel they had more privacy
and did not increasingly engage in self-
disclosure. Further,
trust and norms were identified as
important alternative mechanisms of
privacy (Brandimarte et
al., 2013; Eastin et al., 2016;
Nissenbaum, 2010; Zlatolas et al.,
2015). I will refer to both
findings in the social media privacy
model.
The Interplay of Affordances, Control,
and Privacy
The lack of a relation between privacy
and control might hint that the interplay
of the two
variables is not linear and is actually
more complex (Laufer & Wolfe, 1977).
The relation
between control and privacy should
become clearer if the social media
boundary conditions that
make control a functional mechanism
in one situation but impractical in
another are elucidated.
Social Media and its Boundary
Conditions for Privacy
Carr and Hayes (2015) defined social
media as “…Internet-based channels
that allow
users to opportunistically interact and
selectively self-present, either in real-
time or
asynchronously, with both broad and
narrow audiences who derive value
from user-generated
content and the perception of
interaction with others” (p. 50). They
further pointed out that users’
interaction will increasingly be
influenced by social media affordances.
Further, social media has
been characterized by its content, its
users, and its infrastructure in previous
definitions (Howard
& Parks, 2012). The most prominent
examples of social media are social
network sites (e.g.,
Facebook, Instagram, LinkedIn,
Google+), multimedia platforms (e.g.,
Youtube, Slideshare,
Soundcloud), weblogs (e.g., personal
diaries of mothers, scholars, or self-
appointed or paid
influencers), and microblogs (e.g.,
Twitter). In struggling to develop a
definition of social media,
scholars have pointed to the fact that
social media channels are formally
understood as methods
of mass communication but that they
primarily contain and perpetuate
personal user interactions
(Carr & Hayes, 2015; Papacharissi,
2010). In this sense, social media can
be referred to as
personal publics (Schmidt, 2014). As a
consequence, users cannot always
clearly define the
somewhat blurred lines between
personal and public or between private
and professional
communication. They feel that contexts
collapse and converge (Papacharissi,
2010; Vitak, 2012).
In sum, social media is characterized
by the following boundary conditions:
the content, its flow
and further uses (Howard & Parks,
2012); the communication practices
that users perceive as
their options for exerting control or for
achieving privacy with other means;
and social media
affordances (Carr & Hayes, 2015). In
the following, I will analyze how the
interplay of these
boundary conditions is related to
control and how it determines different
privacy perceptions and
behaviors. Tables 1 and 2 in the
supplemental materials summarize this
theoretical development.
Social Media Boundary Condition 1:
Content, its Flow and Uses
What exactly does an individual strive
to control? The sooner we come to
understand
what individuals strive to control, the
better we can evaluate whether control
can be experienced
in social media. According to most
studies, personal information refers to
the content that people
strive to control in order to maintain
their privacy in social media. Metzger
(2004) referred to
personal information as the content to
be controlled. Quinn (2014) suggested
different layers of
how privacy can be maintained. On the
“content layer,” users’ experience of a
lack of control
leads them to limit the information they
post or even to post false information.
Sarikakis and
Winter (2017) added on the basis of
their qualitative work that users do not
differentiate between
personal information and personal data.
Instead, they define the degree of
intimacy or privacy
needed for a certain piece of
information or data.
Then, besides personal information, the
flow and use of the content needs to be
considered. Social media scholars
specifically address where online
information is forwarded,
archived, and sold. They emphasize
users’ concerns about how little control
they have over the
flow and use of personal information
(Marwick & boyd, 2014; Quinn, 2014;
Tsay-Vogel,
Shanahan, & Signorielli, 2018). This
refers to the forms personal
information takes, where it ends up, and how it is used. In the
following, personal information, its
flow, and further uses will
be considered as what users strive to
control.
Social Media Boundary Condition 2:
Practices of Control
Actively exerting control expresses
informational self-determination,
which implies
having power over information and
agency in decisions regarding this
information. In turn, a loss
of control would mean that other
behavioral options are out of reach and
that individuality (Buss,
2001), power, and agency are severely
threatened (Brummett & Steuber,
2015). Control also
comes along with risk avoidance:
Users have identified the most
important pieces of information
that they want to control as the contents
of their emails, the contents of their
online chats, and
their location (Cho, Lee, & Chung,
2010). As long as they have control
over this information,
they can avoid being harassed, bullied,
financially exploited by companies, or
surveilled by
governmental institutions.
How is control executed and achieved?
First, informational control can be
identified as an
individual’s perception that he or she
has a choice about whether to withhold
or disclose
information (Crowley, 2017; Johnson,
1974). Choice is the first step and
determines whether
control can be exerted and to what
degree (Wolfe & Laufer, 1974). Then,
in the next step, when
choice is available, it has to be put into
action. Two active practices of control
in social media are
consent and correction (Tavani, 2007).
Consent refers to the extent to which
users agree that a
certain piece of information will be
passed along. Correction means that
users are able to
withdraw from this agreement.
Whereas choice was identified long
ago as a practice of control
(Johnson, 1974), consent and correction
were suggested as social media
practices (Tavani, 2007).
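Choice, consent, and correction can be read as successive operations on a piece of shared information. A minimal sketch of that reading; the class and method names are illustrative, not Tavani’s terminology:

```python
class SharedInfo:
    """Sketch of the three control practices: choice (disclose or
    withhold), consent (agree that information may be passed along),
    and correction (withdraw that agreement). Names are illustrative."""

    def __init__(self, content: str):
        self.content = content
        self.disclosed = False     # choice has not been exercised yet
        self.consented_to = set()  # parties allowed to pass the info on

    def choose_to_disclose(self) -> None:
        self.disclosed = True

    def consent(self, party: str) -> None:
        if not self.disclosed:
            raise ValueError("consent presupposes a prior choice to disclose")
        self.consented_to.add(party)

    def correct(self, party: str) -> None:
        self.consented_to.discard(party)  # withdrawing consent is always possible
```

Modeling consent as presupposing disclosure, while correction is always available, mirrors the order of the three practices described above.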
Further, informational control can also
be put into practice by the selective
sharing of
information, self-censorship, audience
segregation, and encryption (Ochs &
Büttner, 2018). All
of these options are directed by the
individual and can be considered
ego-centered. It will be
important to also find terms for
interpersonal privacy regulation
behaviors.
Social Media Boundary Condition 3:
Affordances
Social media can be characterized by
affordances. The term affordance,
initially
suggested by Gibson (1979/2014),
means that the environmental
characteristics of a certain entity are not static but are perceived, experienced, and as such shaped differently by humans. In the case of
social media, this understanding is
more than suitable. Of course, the
process of shaping or
“furnishing” (Gibson, 1979/2014, p.
78) social media influences its further
uses. For example,
teenage users regulate their privacy
through social steganography, an
idiomatic language that is
understood only by their peers but not
their parents, instead of managing their
audiences by
editing their friends lists and
systematically blocking their parents or
certain peers (boyd, 2014).
Inventing and using this kind of
idiomatic language might influence
users’ style of
communication and interactions on
social media. Four affordances have repeatedly
been shown to be particularly important
for social media: anonymity,
editability, association, and
persistence (boyd, 2008; Evans, Pearce,
Vitak, & Treem, 2017; Treem &
Leonardi, 2012). The
affordances of anonymity and
editability allow users to exert control.
By contrast, the affordances
of association and persistence
challenge users’ ability to exert
control. Both clusters of
affordances—those enhancing (Table
1) as well as those challenging control
(Table 2)—have
different implications for how content
is controlled, which practices of
control are available, and
how they affect privacy regulation in
social media realms. In the following, I
intertwine the
results from previous research on
privacy, the content and practices of
control, and affordances.
Anonymity. The affordance of
anonymity describes the idea that other
social media
agents such as other private people,
institutions, or companies do not know
the source of the
message or the sender (Evans et al.,
2017). For social media use, being
unknown to others and
anonymously using social media is rare
(Rainie, Kiesler, Kang, & Madden,
2013). Nevertheless,
anonymity is still appreciated
occasionally. For example, users
commenting on other users’ posts
in self-help or political forums can
decide to keep their interactions
anonymous. In addition, users
of dating websites might at least partly
or temporarily use such sites
anonymously (Ramirez,
Bryant, Erin, Fleuriet, & Cole, 2015).
Anonymity is not a question of “on” or
“off” but is flexible
and scalable (Evans et al., 2017).
Anonymity has attracted enormous
attention in research on
computer-mediated communication
(CMC; Joinson, 2001). Here,
anonymity is specifically
considered a means to control the
access of certain individuals (Qian &
Scott, 2007). Further,
while being online anonymously, an
individual can deliberately decide what
information to share,
with whom to share it, and what to
withhold (Qian & Scott, 2007). While
being online
anonymously, the receiver of a
message cannot provide the CMC with
face-to-face behaviors,
and as such, the control lies in the
hands of the sender (Ben-Ze'ev, 2003).
Control over content, its flow, and its
uses is possible because, in a state of
anonymity, all
of these are disconnected from the user.
Although full anonymity is not afforded
by social media,
U.S. American users occasionally keep
their posts anonymous while using
social media with the
clear aim of exerting control (Rainie et
al., 2013). For example, they disguise
their location or
delete cookies so that companies
cannot identify them. An interview
partner in Sarikakis and
Winter’s (2017) study said: “Well
when I use fake names or email
addresses and fake birthdates I
think that’s the only control you can
try to have” (p. 9). Two aspects,
though, might mitigate the
perception of control. First, users know
that they leave traces behind, and once
they have posted
content online—even if it was posted
anonymously—they might be traceable
because of other
information they left online; second,
anonymity is usually used only to some
extent (e.g., by
leaving one’s name but not one’s
address), and users acknowledge that
with partial anonymity,
they experience only partial control,
and in turn, only partial privacy (Rainie
et al., 2013). What practices do users have available to exert control? First, being online
anonymously can be
considered a question of choice. Woo
(2016) suggested that anonymity—or
by contrast,
identifiability—is the most important
issue in online privacy. He argued that
on the social web,
users should have “islands” of
anonymity. He even encouraged
people to lie and to have
secrets with the aim of regaining
control and autonomy. Then, however,
in an anonymous setting,
consent and corrections are somewhat
disconnected from the user as these are
identity-related
practices.
Anonymity can be an enactment of
control by disguising or lying about
one’s identity or
by leaving it unspecified for some
applications and occasions. Also,
people may leave the source
of their own messages unknown. And,
of course, these enactments of control
might be applied
when interacting with some users but
not with others. In sum, the affordance
of anonymity is
related to informational control, which
has also been demonstrated in
empirical studies (Fox &
McEwan, 2017).
In previous research on privacy,
anonymity has played a crucial role.
Here, it was even
understood as a “type” of privacy
among other types such as solitude,
intimacy, or reserve
(Pedersen, 1999; Westin, 1967). In
sum, anonymity allows users to exert
control and, in turn, it
increases an individual’s subjective
experience of privacy by not being
identifiable at all or by
selectively presenting parts of one’s
own identity (Smith et al., 2011; Woo,
2016).
Editability. While using social media,
users interact remotely in terms of time
and space.
This gives them the chance to edit their
online communications before and
after the
communications are seen by others
(Treem & Leonardi, 2012). Editability
is an affordance that
was previously addressed as important
for CMC in the hyperpersonal model
(Walther, 1996):
Senders of online messages self-
selectively present themselves online
by primarily transmitting
cues and messages that they want others to see and that put them in a positive light. In
addition, although editing one’s identity
is part of any social interaction, social
media platforms
offer more freedom in terms of what,
how, and when these interactions are
edited. Editing allows
users to rehearse, change, package, or
literally craft a message and, in turn, to
rehearse, change,
and craft their personal appearance.
Editability allows the message sender control over content, its flow, and uses because users have the chance to ponder the consequences of their posts (Treem & Leonardi, 2012).
Further, users may control the flow and
further use of their content by
articulating the lists of
online friends that they connect with or
by using a private or public profile
(Ellison & boyd,
2013). The availability of control
practices is highly supported by social
media’s affordance of
editability (Fox & McEwan, 2017).
Users have a choice to either intuitively
post their thoughts or
to pictorially represent their nonverbal
cues. Editing can be considered an
active enactment of
control because users deliberately
choose what to reveal to certain
audiences or what to withhold
(Crowley, 2017). Further, corrections
of one’s own posts and decisions are
possible and can also
be conceived as an enactment of
control (Crowley, 2017).
Control over the flow of information
might also foster subjective
experiences of privacy. In
privacy research, exerting control over
the flow of an interaction was often
understood
synonymously with control or as a
transmission principle that guaranteed
privacy (Nissenbaum,
2010).
Association. Social media platforms are
primarily used because they offer users
the chance
to connect with others and to stay in
touch. Users have articulated the idea
that communication is
their most important motive for using
social network sites (Quinn, 2016).
Consequently, the most
important affordance of social media is
the associations that are created or
maintained between
interaction partners (Ellison & boyd,
2013; Treem & Leonardi, 2012).
The affordance of association and the
chance to exert control over content, its flow, and uses seem strikingly incompatible. In
their cross-sectional study of N = 875
Mechanical Turk
workers, Fox and McEwan (2017)
demonstrated that informational
control and network
association were negatively related.
Control is exerted by an individual
person, and as such, the
individual person is at the center of, if not entirely responsible for, achieving privacy via control. This
is clearly expressed in users’ current
understanding of control. For example,
participants in
Sarikakis and Winter’s (2017) study
identified the individual as the
legitimate “controller” of
privacy (p. 6). Also, boyd (2014)
argued that exerting control with the
aim of achieving privacy
requires the individual’s full
consideration, power, knowledge, and
skills. The only control
practice available would be to
completely withdraw (i.e., not to
participate) and to accept the
disadvantages that come along with
such a decision. Then, ambiguous and
passive enactments of
control might be used, but these have
the same disadvantages.
In sum, control as an issue and a
concept takes the perspective of the
individual. In other
words, it is an “ego-centered need”
(Papacharissi, 2010, p. 144). By
contrast, association is an
interindividual concept. Very likely,
one person’s control ends at the point
where another
person’s control starts. Hence, as much
as the social web is an interdependent
realm involving
other parties, control is not the means
to ensure or exert privacy. On the basis
of these
considerations—and this will be
important for the upcoming
propositions on social media
privacy—other means and mechanisms
to guarantee the subjective experience
of privacy have
become necessary: Users communicate
with each other to ensure privacy. And
further, if
communication is not possible, they
choose communication partners—
individuals, organizations,
institutions—that they can trust. Trust
has been shown to be crucial for
ensuring the perception of
privacy and subsequent online
disclosure (Metzger, 2004). Trust can
be established by personal
communication (Petronio, 2002) and by
norms that the individual user can rely
on (Saeri et al.,
2014). As such, in the upcoming
propositions and the social media
privacy model, trust,
communication, and norms are
conceptualized as the core mechanisms
to ensure privacy beyond
control.
Persistence. The affordance of
persistence addresses the durability of
online expressions
and content (boyd, 2014) and the idea
that after personal information is
published online, it is
automatically recorded and archived
and is consequently replicable (boyd,
2008). It means that
data remain accessible in the same
form over long periods of time and for
diverse and unforeseen
audiences (Evans et al., 2017; Treem &
Leonardi, 2012). Some authors have
emphasized the
positive outcomes that persistence may
have, namely, that it allows knowledge
to be sustained,
creates robust forms of
communication, establishes the growth
of content (Treem & Leonardi,
2012), and allows for authenticity and
longevity (boyd, 2008).
However, as persistence implies the enduring nature of online data, it also
expresses a loss of informational self-
determination. Persistence seems
incompatible with
control. It evokes the idea that control
over content, its flow, and uses is
impossible because once
personal information is posted online,
it is no longer under the sender’s
control. With regard to
control practices, social media users
have the choice to completely
withdraw from online
interactions, to not post their
information, and thus to avoid its
persistence. At the same time, this
would prevent them from receiving the
myriad benefits that come along with
data sharing and
thus does not seem to be a question of
freedom of choice anymore. Also, as
soon as the choice is made to participate, the availability of other control practices such as consent and correction decreases significantly.
Finally, once given, consent decreases
a person’s chances to subsequently
correct previous online
communication. Users know that they
do not have control over how persistent
their data will be,
and this significantly decreases their
subjective experience of privacy
(Rainie et al., 2013). The
lack of control over personal
information due to the persistence of
all kinds of online information
can be considered one of the key issues
of online lives and the subjective
experience of privacy
today. To react to users’ needs to
foresee and understand persistence
(Rainie et al., 2013),
communication, trust, and norms seem
to be adequate ways to ensure privacy.
The Social Media Privacy Model
Social media privacy is based on
interpersonal processes of mutual
disclosure and
communication (Altman, 1975;
Petronio, 2002). Further, it can be
considered a value that is co-
developed by engaging in
communication and that is expressed
by a shared perception
(Nissenbaum, 2010; Smith et al.,
2011). On the basis of these
considerations, I propose:
Proposition 1: Privacy is
interdependently perceived and valued.
In contrast to privacy, control is centered on the individual person. Control is exerted
by the individual person, and it can be
considered to be ego-centered
(Papacharissi, 2010;
Sarikakis & Winter, 2017). Social
media platforms aim for
connectedness, interdependence, and
sociality and can be described as
social-centered (Ellison & boyd, 2013).
As a consequence,
social media privacy cannot be
sufficiently achieved by exerting
control. Other mechanisms are
necessary to ensure social media
privacy.
Proposition 2: Privacy cannot be
satisfactorily achieved by exerting
control in social media.
Instead of striving for control as an
end-game of privacy, the opposite is
necessary to
experience privacy in social media.
Users need to constantly communicate
with each other as
well as with institutions and companies
to ensure their privacy. They need to
engage in both
interpersonal and deliberative
communication processes.
Interpersonal communication is
understood as interactions between
users as well as interactions between
the individual user and
others who represent third parties such
as institutions and companies.
Deliberation is defined as
either informal or institutionalized
interaction among internet users (and
eventually
representatives of governments,
institutions, or companies), involving
rational-critical decision
making and the earnest aim to find a
solution (Burkhalter, Gastil, &
Kelshaw, 2002).
Proposition 3: Interpersonal
communication is a mechanism by
which social media privacy can
be interdependently ensured and put into
effect.
However, not all actions and steps of
online media use can be accompanied
by
communication processes. For many if
not most questions of privacy, people
can rely on past
experiences. Here, communication and
deliberation crystallize into a stable
relationship-based
result or even solution, i.e. trust (or
mistrust) and norms (or anomia). Trust
can be defined as an
anticipation and expectation of how a
person or institution will behave and as
such reduces
uncertainty (Waldman, 2018, p. 4).
Trust has been shown to ensure privacy
on the basis of
longstanding communication and
reliable bonds (Saeri et al., 2014). Trust
can be conceived as
both a crucial factor of influence in
decisions over self-disclosure and as a
result of
communication (Saeri et al., 2014). In
turn, mistrust—which has not yet been
addressed in
privacy research—is a menace to
privacy and as such should initiate
communication. In a
qualitative study on privacy
perceptions, Teutsch, Masur, and
Trepte (2018) demonstrated that
participants perceived that they had lost
control and had substituted trust for
control. One of the
interview partners said, “Well, privacy
is absolute trust between conversational
partners and
absolute, absolute certainty that the
subject of conversation will stay within
this sphere” (p. 7).
Eichenhofer (2019) suggested that the
“trust paradigm” should point toward a
more current
perspective on privacy regulation via
trust in contrast to privacy regulation
via control or self-
determination.
In the case of privacy regulation, both
social and legislative norms come into
play (Gusy,
2018; Spottswood & Hancock, 2017;
Utz & Krämer, 2009). Social norms are
understood as
social pressure to engage in a certain
kind of behavior and are established by
what others approve
of (injunctive norms) and what they
actually do (descriptive norms) (Saeri
et al., 2014). Although
legislative norms are coined by jurisprudence and regulated by law (and not on the basis of observing others), they resemble social norms in that they prescribe a certain behavior and allow for sanctions in case this behavior is not shown. Previous research has shown that trust and norms are the keys to obtaining privacy
(Marwick & boyd, 2014; Quinn, 2014).
To establish trust
and norms, of course, communication
is necessary.
Proposition 4: Trust and norms
function as privacy mechanisms that
represent
crystallized privacy communication.
I suggest that control and
communication have found a new
balance in social media
communication: Control is losing
control and communication is gaining
power. In other words,
users do not rely solely on control in the first place but strive to communicate about privacy to establish norms and trust, and sometimes even to regain control.
In fact, privacy’s
interdependence is expressed and put
forward by interpersonal
communication. This emerging
social turn in privacy theory is also
acknowledged in the definition of
privacy.
I define privacy by an individual’s assessments
of (a) the level of access to this
person in an interaction or relationship with
others (people, companies,
institutions) and (b) the availability of the
mechanisms of control,
interpersonal communication, trust, and norms
for shaping this level of access
through (c) self-disclosure as (almost intuitive)
behavioral privacy regulation
and (d) control, interpersonal communication,
and deliberation as means for
ensuring (a somewhat more elaborated)
regulation of privacy. In social media,
then, the availability of the mechanisms that
can be applied to ensure privacy
is crucially influenced by the content that is
being shared and the social media
affordances that determine how this content is
further used.
In the following, I will summarize the
theoretical rationale developed in this
article in the
chronology of a communicative
process. Further, I will show how the
individual’s privacy
assessments referred to in the four
propositions eventually lead to different
forms of privacy
regulation behaviors. The process is
illustrated in Figure 1. The following
steps are meant to
make the model accessible for
empirical investigation.
The first part of the flowchart refers to
the social media user’s subjective and
initial
assessment: All humans have
individual levels of access they
perceive as being more or less
adequate and comfortable. This
dispositional level of access is a
quantifiable dimension varying
between high and low levels and
expressing the individual’s
dispositional willingness to self-
disclose. In contrast, the
communication goal is rooted in the
situation and defines what is to be
achieved in this particular situation.
The individual’s communication goals
in social media are
manifold and important to consider
when assessing online privacy. They
can most likely be
understood as a qualitative scenario of
what the individual user wants and
needs to communicate
in a certain situation. Hence, the point
of departure for any consideration about privacy is the set of more or less consciously asked
questions: How do I feel, what do I
need, and what is my
goal for this particular situation?
The second part of the model refers to
the social media boundary conditions
that are
encountered. Here, content and
affordances dynamically interact (as
indicated with the
multiplication sign) with an
individual’s initial assessment. The
individual weighs the ideal level
of access and his/her communication
goals against social media boundary
conditions by
considering what content is shared and where; how it might flow from one user or institution to another; and how it might be used.
Social media content becomes dynamic
as it is displayed and
shared. Affordances represent this
dynamic and, together with the
individual’s dispositions and
goals, shape the available privacy
mechanisms: control, trust, norms, and
interpersonal
communication. Hence, users assess
whether they have a choice, whether
they can rely on trust or
norms, or whether they will (have to)
engage in interpersonal
communication.
The third part of the model refers to the
subjective experience of privacy. The
individual
experiences a certain level of access
that results from the individual’s goals
on the one hand and
the social media boundary conditions
and privacy mechanisms that are
applied to actively
regulate privacy on the other. This
experience is here understood as the
rather unfiltered
accumulation of external stimuli and
internal needs; it then results in a more elaborated re-assessment, i.e. the privacy perception, which can be verbalized and is empirically accessible.
The privacy perception results in
different forms of privacy regulation
behaviors. First,
self-disclosure is among the most
intuitive regulation behaviors and
includes all information
intentionally shared (or not shared)
with others. And, for the case that the
privacy mechanism of
control is available and considered
adequate for a certain communication
goal, users exert control
actively and intentionally by restricting
access to information, audience
segregation, self-
censorship, encryption; or, rather
ambiguously by softening the truth,
obfuscating information.
When the privacy mechanisms do not
allow the deliberate and somewhat
egocentric privacy
regulation (i.e. when individuals do not
have at least partial control), other
regulation behaviors
come into play. The individual might
engage in interpersonal communication
or even deliberation
to negotiate and interdependently
regulate privacy. Interpersonal
communication, deliberation,
and control are meta-level regulation
behaviors that come into play when
privacy behaviors are
not intuitive, when contexts collapse, or
when a certain situation demands
further elaboration
and/or communication.
The communication process shown in
Figure 1 will take turns in a constant
flow of
assessments and re-assessments as soon
as the reactions of others lead to
changes in conditions or
when personal goals or needs change.
In what follows, I will discuss the
capabilities and
consequences of the model’s theoretical
propositions. What are the possible
effects and what are
the pitfalls and opportunities that will
occur if communication, trust, and
norms are substituted
for control?
Challenging the Social Media Privacy
Model
Social media use is at the heart of
human communication and offers all of
its many merits
such as practicing freedom of speech or
reaching out to people in critical living
conditions and
providing them with social support. In
addition, social media communication
offers particular
benefits because it is ubiquitous and
independent of time and location. As
such, for example, it
allows people to communicate across
borders into the lands of friends, foes,
and fiends. All of
these merits of online communication
give its users enormous freedom. This
freedom is—and
this is again one of the major merits of
social media communication—often
independent of the
domestic or cultural situation of the
user. Freedom is historically closely
linked to privacy. In
medieval times, only those who had
land were free (Moore, 1984). And
only those who
possessed land had the freedom to
withdraw, grant, or restrict access to
their land. In turn, the
“unfree,” who did not have their own
land or possessions, did not have the
right to privacy. In
this sense, freedom is the origin of privacy, both in legislation and in its genealogy (Gusy, 2018).
For social media and online realms,
freedom and privacy are closely
connected. However,
the ideas of possessions and ownership
do not seem fully applicable anymore.
Borders and
possessions have become fuzzy
because, very often, data and personal
information are perceived
as shared goods as soon as they appear
online. Nissenbaum (2010)
summarized her work on
contextual integrity with the words:
“We have a right to privacy, but it is
neither a right to control
personal information nor a right to
have access to this information
restricted” (p. 231). She
conceded that for social media, privacy
is rather a value and a perception.
Borders and possessions have a
permanent nature. By contrast, values
and perception are
subject to interpretation and change.
This understanding of privacy as
subject to interpretation
and change makes it a matter of
communication. Eventually,
communication about privacy will
result in trust, and if successfully
shared in a society, in social and
legislative norms. However,
communication cannot be understood
as a long and painful process that will
finally show the
solution and lead us into the light of
privacy. Just the opposite is the case. In
social media,
communication about data sharing and
the use and flow of interpersonal data
is the solution itself.
Only due to ongoing and critical
assessment, reassessment, and dynamic
communication will we
have the chance to ensure privacy as
one of the most important values of
civilized societies.
Privacy’s interdependence is expressed
and put forward by interpersonal
communication.
And in fact, privacy is experiencing a
“social turn,” in social media and
beyond (Helm &
Eichenhofer, 2019). This emerging
social turn in privacy theory is
acknowledged in the social
media privacy model. However, there
are downsides to a conception of
privacy as a
communicative process. First, not all
members of this communicative
process will have equal
chances of being heard. For example,
digital and literacy gaps have been
shown to crucially
influence participation in these
communication processes (Helsper,
2017).
Second, online information has become
a commodified good, and financial
interests are
stark (Sevignani, 2016). Purposeless
communication among friends is
increasingly exploited for
economic reasons (Seubert & Becker,
2019); and, companies do their best to
avoid interactions
between individual users who strive to
regulate their privacy. In turn, they try
to establish trust
through strong branding activities that
have been shown to override social
media users’ privacy
concerns and their interest in solving
privacy issues by actively
communicating and participating
(Boerman, Kruikemeier, & Zuiderveen
Borgesius, 2018; Li, 2011).
Third, as a consequence, the
requirements of communication and
trust demand a lot from
users. Control implies a settled state in
which the individual person can lie
back and stop thinking
about the flow of personal information
online. Communication, trust, and
norms, by contrast, are
subject to change and thus require
constant assessment and consideration.
Hence, information
control should also be considered with
regard to an individual’s ability to
exert control (Grimm &
Bräunlich, 2015). The user-centered perspective needs to be complemented by a system-based perspective and corresponding interventions (Schäwel, 2020).
Fourth, this demanding process of
communication might also result in a
threat to the
ability to develop a self-determined
identity. Westin (1967) held that
boundary control means
identity control: “This deliberate
penetration of the individual’s
protective shell, his
psychological armor, would leave him
naked to ridicule and shame and would
put him under the
control of those who knew his secrets”
(p. 33). As a consequence, lost control
would mean
threats to identity development.
Finally, I embrace the demanding and
somewhat stressful nature of
communicating about
privacy in social media. In social
media, there is only limited control
over personal information.
In addition, the handling of this lack of
control is demanding and stressful. By
acknowledging
these two observations, users will
acknowledge that they need to take
action, engage in
communication, and establish trust and
shared social and legislative norms. In
social media,
privacy is not a private affair. It is at
the center of communication. We are
out of control because
we have so much to share. Hence,
interpersonal communication, trust,
and norms are the three
most important mechanisms that
interdependently help to ensure social
media privacy.
References
Altman, I. (1975). The environment
and social behavior: Privacy, personal
space, territory,
crowding. Monterey, CA:
Brooks/Cole Publishing Company.
Anderson, J., Rainie, L., & Duggan,
M. (2014). Digital life in 2025.
Retrieved from
http://www.pewinternet.org/
2014/03/11/digital-life-in-2025/
Ben-Ze'ev, A. (2003). Privacy,
emotional closeness, and openness in
cyberspace. Computers
in Human Behavior, 19(4), 451–467.
https://doi.org/10.1016/S0747-
5632(02)00078-X
Boerman, S. C., Kruikemeier, S., &
Zuiderveen Borgesius, F. J. (2018).
Exploring
motivations for online privacy
protection behavior: Insights from
panel data.
Communication Research, 25.
https://doi.org/10.1177/009365021880
0915
boyd, d. (2008). Taken out of context.
American teen sociality in networked
publics (Doctoral
dissertation). University of California,
Berkeley.
boyd, d. (2014). It's complicated. The
social lives of networked teens. New
Haven, CT: Yale
University Press.
Brandimarte, L., Acquisti, A., &
Loewenstein, G. (2013). Misplaced
confidences: Privacy and
the control paradox. Social
Psychological and Personality Science,
4(3), 340–347.
https://doi.org/
10.1177/1948550612455931
Brummett, E. A., & Steuber, K. R.
(2015). To reveal or conceal? Privacy
management
processes among interracial romantic
partners. Western Journal of
Communication, 79(1),
22–44.
https://doi.org/10.1080/10570314.201
4.943417
Burgoon, J. K. (1982). Privacy and
communication. Communication
Yearbook, 6(4), 206–
249. https://doi.org/10.1080/23808985
Burkhalter, S., Gastil, J., & Kelshaw,
T. (2002). A conceptual definition and
theoretical model
of public deliberation in small face-to-
face groups. Communication Theory,
12(4), 398–
422.
https://doi.org/10.1093/ct/12.4.398
Buss, A. (2001). Psychological
dimensions of the self. Thousand
Oaks, CA: Sage.
Carr, C. T., & Hayes, R. A. (2015).
Social media: Defining, developing,
and divining. Atlantic
Journal of Communication, 23(1), 46–
65.
https://doi.org/10.1080/15456870.201
5.972282
Cho, H., Lee, J.-S., & Chung, S.
(2010). Optimistic bias about online
privacy risks: Testing
the moderating effects of perceived
controllability and prior experience.
Computers in
Human Behavior, 26, 987–995.
https://doi.org/10.1016/j.chb.2010.02.
012
Crowley, J. L. (2017). A framework
of relational information control: A
review and extension
of information control research in
interpersonal contexts.
Communication Theory, 27(2),
202–222.
https://doi.org/10.1111/comt.12115
Derlega, V. J., Metts, S., Petronio, S.,
& Margulis, S. T. (1993). Self-
disclosure. Sage series
on close relationships. Newbury Park,
CA: Sage Publications.
Dienlin, T. (2014). The privacy
process model. In S. Garnett, S. Halft,
M. Herz, & J. M.
Mönig (Eds.), Medien und Privatheit
[Media and privacy] (pp. 105–122).
Passau,
Germany: Karl Stutz.
Eastin, M. S., Brinson, N. H., Doorey,
A., & Wilcox, G. (2016). Living in a
big data world:
predicting mobile commerce activity
through privacy concerns. Computers
in Human
Behavior, 58, 214–220.
https://doi.org/10.1016/j.chb.2015.12.
050
Eichenhofer, J. (2019). e-Privacy -
Theorie und Dogmatik eines
europäischen
Privatheitsschutzes im Internet-
Zeitalter [Theoretical and doctrinal
foundations of a
European privacy protection
regulation in the internet age].
Bielefeld: University of
Bielefeld.
Ellison, N. B., & boyd, d. (2013).
Sociality through social network sites.
In W. H. Dutton
(Ed.), The Oxford handbook of
Internet studies (pp. 151–172).
Oxford, UK: Oxford
University Press.
European Commission. (2015).
Special Eurobarometer 431: Data
protection. Brussels, BE.
Retrieved from
http://ec.europa.eu/public_opinion/arc
hives/ebs/ebs_431_en.pdf
Evans, S. K., Pearce, K. E., Vitak, J.,
& Treem, J. W. (2017). Explicating
affordances: A
conceptual framework for
understanding affordances in
communication research. Journal
of Computer-Mediated
Communication, 22(1), 35–52.
https://doi.org/10.1111/jcc4.12180
Fox, J., & McEwan, B. (2017).
Distinguishing technologies for social
interaction: The
perceived social affordances of
communication channels scale.
Communication
Monographs, 84(3), 298–318.
https://doi.org/10.1080/03637751.201
7.1332418
Gibson, J. J. (2014). The ecological
approach to visual perception.
Psychology Press &
Routledge Classic Editions. Hoboken,
NJ: Taylor and Francis (Original work
published
1979).
Grimm, R., & Bräunlich, K. (2015).
Vertrauen und Privatheit [Trust and
privacy]. DuD
Datenschutz und Datensicherheit
[Data protection and data security], 5,
289–294.
Gusy, C. (2018). Datenschutz als
Privatheitsschutz oder Datenschutz
statt Privatheitsschutz?
[Data protection as privacy protection
or privacy protection as data
protection?].
Europäische Grundrechte Zeitschrift
[European Fundamental Rights
Journal], 45(9-12),
244–255.
Helm, P., & Eichenhofer, J. (2019). Reflexionen zu einem social turn in den privacy studies [Reflections on a social turn in privacy studies].
In C. Aldenhoff, L. Edeler, Hennig,
M., Kelsch, J., L. Raabe, & F. Sobala
(Eds.),
Digitalität und Privatheit [Digitality
and Privacy] (pp. 139–166). Bielefeld,
Germany:
transcript.
https://doi.org/10.14361/97838394466
14-009
Helsper, E. J. (2017). The social
relativity of digital exclusion:
Applying relative deprivation
theory to digital inequalities.
Communication Theory, 27(3), 223–
242.
https://doi.org/10.1111/comt.12110
Howard, P. N., & Parks, M. R. (2012).
Social media and political change:
Capacity,
constraint, and consequence. Journal
of Communication, 62(2), 359–362.
https://doi.org/10.1111/j.1460-
2466.2012.01626.x
Johnson, C. A. (1974). Privacy as
personal control. In S. T. Margulis
(Ed.), Man-environment
interactions: Evaluations and
applications (pp. 83–100).
Stroudsburg, PA: Dowden,
Hutchinson & Ross.
Joinson, A. N. (2001). Self-disclosure
in computer-mediated
communication: The role of self-
awareness and visual anonymity.
European Journal of Social
Psychology, 31(2), 177–192.
https://doi.org/10.1002/ejsp.36
Laufer, R. S. [R. S.], & Wolfe, M.
(1977). Privacy as a concept and a
social issue: A
multidimensional developmental
theory. Journal of Social Issues, 33(3),
22–42.
https://doi.org/10.1111/j.1540-
4560.1977.tb01880.x
Li, Y. (2011). Empirical studies on
online information privacy concerns:
Literature review
and an integrative framework.
Communications of the Association
for Information Systems,
28(1), 453–496. Retrieved from
http://aisel.aisnet.org/cais/vol28/iss1/2
8
Madden, M. (2014). Public
perceptions of privacy and security in
the post-Snowden era.
Retrieved from
http://www.pewinternet.org/2014/11/1
2/public-privacy-perceptions/
Madden, M., & Rainie, L. (2015).
Americans’ attitudes about privacy,
security and
surveillance. Retrieved from
http://www.pewinternet.org/2015/05/2
0/americans-attitudes-
about-privacy-security-and-
surveillance/
Marwick, A. E., & boyd, d. (2014).
Networked privacy. How teenagers
negotiate context in
social media. New Media & Society,
16(7), 1051–1067.
https://doi.org/
10.1177/1461444814543995
Masur, P. K. (2019). Situational
privacy and self-disclosure:
Communication processes in
online environments. Cham,
Switzerland: Springer International
Publishing.
Metzger, M. J. (2004). Privacy, trust,
and disclosure: Exploring barriers to
electronic
commerce. Journal of Computer-
Mediated Communication, 9(4).
https://doi.org/10.1111/j.1083-
6101.2004.tb00292.x
Moor, J. H. (1997). Towards a theory
of privacy in the information age.
ACM SIGCAS
Computers and Society, 27(3), 27–32.
https://doi.org/10.1145/270858.27086
6
Moore, B. (1984). Privacy: Studies in
social and cultural history. Armonk,
N.Y.: M.E.
Sharpe.
Nissenbaum, H. (2010). Privacy in
context: Technology, policy, and the
integrity of social
life. Palo Alto, CA: Stanford
University Press.
Ochs, C., & Büttner, B. (2018). Das
Internet als "Sauerstoff" und
"Bedrohung" [The internet
as oxygen and menace]. In M.
Friedewald (Ed.), DuD-Fachbeiträge.
Privatheit und
selbstbestimmtes Leben in der
digitalen Welt [Privacy and a self-
determined life in a
digital world] (pp. 33–80).
Wiesbaden, Germany: Springer
Vieweg.
Papacharissi, Z. (2010). A private
sphere: Democracy in a digital age.
Cambridge: Polity
Press.
Pedersen, D. M. (1999). Model for
types of privacy by privacy functions.
Journal of
Environmental Psychology, 19, 397–
405.
https://doi.org/10.1006/jevp.1999.014
0
Petronio, S. (2002). Boundaries of
privacy. Albany, NY: State University
of New York Press.
Qian, H., & Scott, C. R. (2007).
Anonymity and self-disclosure on
weblogs. Journal of
Computer-Mediated Communication,
12(4), 1428-1451.
https://doi.org/10.1111/j.1083-
6101.2007.00380.x
Quinn, K. (2014). An ecological
approach to privacy: “Doing” online
privacy at midlife.
Journal of Broadcasting & Electronic
Media, 58(4), 562–580.
https://doi.org/
10.1080/08838151.2014.966357
Quinn, K. (2016). Why we share: A
uses and gratifications approach to
privacy regulation in
social media use. Journal of
Broadcasting & Electronic Media,
60(1), 61–86.
https://doi.org/
10.1080/08838151.2015.1127245
Rainie, L., Kiesler, S., Kang, R., & Madden, M. (2013). Anonymity, privacy, and security online. Retrieved from http://www.pewinternet.org/2013/09/05/anonymity-privacy-and-security-online/
Ramirez, A., Sumner, E. M., Fleuriet, C., & Cole, M. (2015). When online dating partners meet offline: The effect of modality switching on relational communication between online daters. Journal of Computer-Mediated Communication, 20(1), 99–114. https://doi.org/10.1111/jcc4.12101
Saeri, A. K., Ogilvie, C., La Macchia, S. T., Smith, J. R., & Louis, W. R. (2014). Predicting Facebook users' online privacy protection: Risk, trust, norm focus theory, and the theory of planned behavior. The Journal of Social Psychology, 154(4), 352–369. https://doi.org/10.1080/00224545.2014.914881
Sarikakis, K., & Winter, L. (2017). Social media users' legal consciousness about privacy. Social Media + Society, 3(1), 1–14. https://doi.org/10.1177/2056305117695325
Schäwel, J. (2020). How to raise users' awareness of online privacy. Duisburg, Germany: University of Duisburg-Essen.
Schmidt, J.-H. (2014). Twitter and the rise of personal publics. In K. Weller, A. Bruns, J. Burgess, M. Mahrt, & C. Puschmann (Eds.), Digital formations: Vol. 89. Twitter and society (pp. 3–14). New York: Peter Lang.
Seubert, S., & Becker, C. (2019). The culture industry revisited: Sociophilosophical reflections on 'privacy' in the digital age. Philosophy & Social Criticism, 45(8), 930–947. https://doi.org/10.1177/0191453719849719
Sevignani, S. (2016). Privacy and capitalism in the age of social media. Routledge research in information technology and society: Vol. 18. New York, NY: Routledge.
Slater, M. D. (2007). Reinforcing spirals: The mutual influence of media selectivity and media effects and their impact on individual behavior and social identity. Communication Theory, 17(3), 281–303. https://doi.org/10.1111/j.1468-2885.2007.00296.x
Smith, H. J., Dinev, T., & Xu, H. (2011). Information privacy research: An interdisciplinary review. MIS Quarterly, 35(4), 989–1016.
Spottswood, E. L., & Hancock, J. T. (2017). Should I share that? Prompting social norms that influence privacy behaviors on a social networking site. Journal of Computer-Mediated Communication, 22(2), 26. https://doi.org/10.1111/jcc4.12182
Taneja, A., Vitrano, J., & Gengo, N. J. (2014). Rationality-based beliefs affecting individual's attitude and intention to use privacy controls on Facebook: An empirical investigation. Computers in Human Behavior, 38, 159–173. https://doi.org/10.1016/j.chb.2014.05.027
Tavani, H. T. (2007). Philosophical theories of privacy: Implications for an adequate online privacy policy. Metaphilosophy, 38(1), 1–22. https://doi.org/10.1111/j.1467-9973.2006.00474.x
Tavani, H. T., & Moor, J. H. (2001). Privacy protection, control of information, and privacy-enhancing technologies. ACM SIGCAS Computers and Society, 31(1), 6–11. https://doi.org/10.1145/572277.572278
Teutsch, D., Masur, P. K., & Trepte, S. (2018). Privacy in mediated and nonmediated interpersonal communication: How subjective concepts and situational perceptions influence behaviors. Social Media + Society, 4(2), 1–14. https://doi.org/10.1177/2056305118767134
Treem, J. W., & Leonardi, P. M. (2012). Social media use in organizations: Exploring the affordances of visibility, editability, persistence, and association. Communication Yearbook, 36, 143–189. https://doi.org/10.1080/23808985.2013.11679130
Trepte, S., & Masur, P. K. (2017). Need for privacy. In V. Zeigler-Hill & T. K. Shackelford (Eds.), Encyclopedia of personality and individual differences. London, UK: Springer. https://doi.org/10.1007/978-3-319-28099-8_540-1
Trepte, S., & Reinecke, L. (Eds.). (2011). Privacy online: Perspectives on privacy and self-disclosure in the social web. Berlin, Germany: Springer.
Trepte, S., Reinecke, L., Ellison, N. B., Quiring, O., Yao, M. Z., & Ziegele, M. (2017). A cross-cultural perspective on the privacy calculus. Social Media + Society, 3(1), 1–13. https://doi.org/10.1177/2056305116688035
Tsay-Vogel, M., Shanahan, J., & Signorielli, N. (2018). Social media cultivating perceptions of privacy: A 5-year analysis of privacy attitudes and self-disclosure behaviors among Facebook users. New Media & Society, 20(1), 141–161. https://doi.org/10.1177/1461444816660731
Utz, S., & Krämer, N. (2009). The privacy paradox on social network sites revisited: The role of individual characteristics and group norms. Journal of Psychosocial Research on Cyberspace, 3(2). Retrieved from http://cyberpsychology.eu/view.php?cisloclanku=2009111001&article=2
Vitak, J. (2012). The impact of context collapse and privacy on social network site disclosures. Journal of Broadcasting & Electronic Media, 56(4), 451–470. https://doi.org/10.1080/08838151.2012.732140
Waldman, A. E. (2018). Privacy as trust. Cambridge, UK: Cambridge University Press. https://doi.org/10.1017/9781316888667
Walther, J. B. (1996). Computer-mediated communication: Impersonal, interpersonal, and hyperpersonal interaction. Communication Research, 23(1), 3–43. https://doi.org/10.1177/009365096023001001
Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220.
Westin, A. F. (1967). Privacy and freedom. New York, NY: Atheneum.
Wolfe, M., & Laufer, R. (1974). The concept of privacy in childhood and adolescence. In S. T. Margulis (Ed.), Man-environment interactions: Evaluations and applications (pp. 29–54). Stroudsburg, PA: Dowden, Hutchinson & Ross.
Abstract
Privacy has been defined as the selective control of information sharing, where control is key. For social media, however, an individual user’s informational control has become more difficult. In this theoretical article, I review how the term control is part of theorizing on privacy, and I develop an understanding of online privacy with communication as the core mechanism by which privacy is regulated. The results of this article’s theoretical development are molded into a definition of privacy and the social media privacy model. The model is based on four propositions: Privacy in social media is interdependently perceived and valued. Thus, it cannot always be achieved through control. As an alternative, interpersonal communication is the primary mechanism by which to ensure social media privacy. Finally, trust and norms function as mechanisms that represent crystallized privacy communication. Further materials are available at https://osf.io/xhqjy/
Keywords: privacy, control, social media, affordances, communication, social media privacy model, definition of privacy
The Social Media Privacy Model: Privacy and Communication in the Light of Social Media Affordances
In historical and current theories about privacy, control has been perceived as an important defining term. The majority of privacy scholars understand control as the means by which to regulate and finally experience privacy (Altman, 1975; Burgoon, 1982; Petronio, 2002). The underlying assumption is that the more users can control access to their personal lives, or—more technically—to their data, the more privacy they experience. Also, the most current understanding held by social media users is that they need control to achieve privacy and informational self-determination (Marwick & boyd, 2014). Large majorities of 80% to 90% of U.S. Americans (Madden & Rainie, 2015) and Europeans (European Commission, 2015) say that it is important to them to be in control of determining who can obtain information about them and what information is collected about them (see also Sarikakis & Winter, 2017).
There is no question that users face decreasing informational control while communicating via social media. Due to their networked nature, social media applications do not allow users to control what friends, acquaintances, institutions, or companies do with the information, pictures, and stories that are shared online (Marwick & boyd, 2014). Further, social media communication takes place in larger and larger parts of users’ lives. All of the more current applications and devices aggregate information and exert automatic control (Anderson, Rainie, & Duggan, 2014). As a reaction to the increasing amounts of data that are exchanged and the sociality of such data, 91% of users perceive that they have lost control over how their personal information is collected and used by friends, acquaintances, and colleagues (Quinn, 2014) and especially by companies and governments (Madden, 2014).
These two observations—the understanding of privacy as control on the one hand and the experience of decreasing control over information while using social media on the other—can be termed a control issue of privacy. In the remainder of this article, I will suggest an understanding of privacy that is adequate for social media use and the requirements emerging from this issue.
The Relationship of Privacy and Control
Privacy is a concept that has been considered and defined in very different disciplines, from descriptive, empirical, and normative perspectives (Sevignani, 2016; Trepte & Reinecke, 2011). In earlier days, privacy was considered a human right and identified as the “right to be let alone” (Warren & Brandeis, 1890, p. 193). Later and more specifically, privacy was defined as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” (Westin, 1967, p. 7) or “the selective control of access to the self” (Altman, 1975, p. 24).
Informational control has only seldom been defined, but the most common definitions touch either a static or behavioral aspect of control: Informational control foremost means that owners of a certain piece of information have a choice over whether, when, and to what extent they will disclose or withhold personal information (Crowley, 2017; Tavani, 2007). Here, control is static, a question of more or less, yes or no. It can be understood as an option or an available mechanism. Then, control can be exerted actively (e.g., restricting access to information, audience segregation, self-censorship, encryption), ambiguously (e.g., softening the truth, obfuscating information, or engaging in other forms of partial disclosure), or passively (e.g., unintentionally omitting information) (Crowley, 2017; Ochs & Büttner, 2018). In this rather behavioral understanding, informational control is executed and experienced by the individual person. In both perspectives, control is centered around the individual and individual decision making.
The majority of privacy theories are devoted to two—somewhat contradictory—paradigms: I will call the first paradigm “privacy as control,” because here, privacy and control are strongly connected, and the second paradigm “privacy and control,” because here, privacy and control are treated as separate constructs with conditional relationships. I will then suggest a third perspective that redefines the meaning and impact of control and the conditions under which control becomes relevant. This perspective will be summarized in the social media privacy model.
Paradigm 1: Privacy as Control
In the seminal work by Altman (1975) and the privacy regulation model of self-disclosure (Derlega, Metts, Petronio, & Margulis, 1993), control was set forth as the crucial mechanism of privacy. More recent conceptualizations have also referred to control as a precondition of privacy (Petronio, 2002). Even in their very first conceptualization of privacy, Warren and Brandeis (1890) referred to privacy as the right to control what others publish about oneself. In an overview of privacy theories, Smith, Dinev, and Xu (2011) investigated 448 publications on privacy. They found that—besides an understanding of privacy as a value—the cognate-based understanding of privacy as control has dominated the social sciences.
The vast majority of privacy scholars have referred to control as a dynamic behavior in the process of privacy regulation to grant or deny access. Altman (1975) suggested a process model with three steps: First, an individual assesses the desired level of privacy; then the individual eventually regulates privacy by controlling interpersonal boundaries; and finally, the individual again assesses the achieved level of privacy. In this flow model, the crucial role that was assigned to control becomes apparent. On the basis of this notion, Petronio (2002) articulated how control is the engine of privacy management. In her understanding, an individual jointly manages and coordinates rules with others while interacting with them. Here again, control is not only the behavior through which privacy can be gained, but control is also the means by which to measure the status quo of privacy, and in turn, it will foster the extent to which privacy regulation is further engaged in through an exertion of control.
Privacy scholars have also referred to the question of what is being controlled. Here, in particular, the control of access to boundaries and the control of the flow of an interaction were addressed as the topics or processes that needed to be controlled (Johnson, 1974; Wolfe & Laufer, 1974). Further, control over the stimuli that impinge upon a person was also deemed necessary (Wolfe & Laufer, 1974). Margulis (1974) explained that control refers to all matters being exchanged between individuals: “Privacy, as a whole or in part, represents the control of transactions between person(s) and other(s)…” (p. 77).
In some theories, control has been used almost interchangeably with privacy. For example, Derlega et al. (1993) stated that “…privacy represents control over the amount and kind of information exchange that persons have with one another” (p. 67). Then, Burgoon (1982) differentiated between four dimensions of privacy, all of which refer to how much control an individual has: Possessing physical privacy refers to whether and how much control an individual perceives to have over physical boundaries. Social privacy refers to how much control an individual perceives to have over the access of others to the person’s environments. Psychological privacy refers to how much control an individual perceives to have over emotional and cognitive input and output. Finally, informational privacy refers to how much control an individual perceives to have over the use of personal data. In this conceptualization, the ability to exert control is the key to an individual’s privacy perception and, in turn, regulation. Many empirical studies have addressed the relationship between control and privacy, but only a minority of studies have supported the notion that privacy behavior is related to informational control (Brandimarte, Acquisti, & Loewenstein, 2013).
In sum, studies that have been based on this first paradigm have underscored the idea that individuals exert control to achieve privacy. In turn, privacy should be achieved if a certain level of control is successfully executed and maintained as the status quo. However, these conceptualizations of privacy suggest a linear relationship between privacy and control. They assume that “…the more one has control over this information exchange, the greater the amount of privacy one has in a social relationship” (Derlega et al., 1993, p. 67). However, previous empirical research did not find a linear relationship between privacy and control. Hence, there is a mismatch between the theoretical assumption that privacy and control are closely related on the one hand and the rare empirical data supporting this notion on the other.
Paradigm 2: Privacy and Control
In social media, an individual person cannot easily execute control because personal information is exchanged between many parties and with a broad range of applications. Users experience social media as more confusing, demanding, and complex with regard to the control that they have over their personal information than face-to-face communication (Marwick & boyd, 2014; Quinn, 2014). Woo (2016) expressed this confusion while mimicking the presumed thoughts of a user: “Please do contact me and give me benefits, but I still do not want to fully give up my control (but I do not know how to have that control)” (p. 954). In other words, users want to take advantage of the networked nature of social media, are painfully aware of the deficits in control, but have not yet found solutions for how to embrace their needs for both gratification and informational control. This process of weighing privacy risks and social gratifications has also been investigated under the umbrella of the privacy calculus (Trepte et al., 2017).
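The privacy-calculus logic described here (a user discloses when anticipated social gratifications are judged to outweigh perceived privacy risks) can be sketched as a toy model. The function, variable names, and example weights below are hypothetical illustrations and are not taken from the article or from the privacy-calculus literature:

```python
# Toy sketch of the privacy-calculus logic: disclosure is predicted when
# the summed anticipated gratifications exceed the summed perceived risks.
# All names and weights are hypothetical illustrations.

def weighs_in_favor_of_disclosure(gratifications, risks):
    """Return True if gratifications outweigh risks in this situation."""
    return sum(gratifications.values()) > sum(risks.values())

# A hypothetical situation: posting a vacation photo on a profile.
gratifications = {"social_support": 0.8, "self_presentation": 0.6}
risks = {"unwanted_audience": 0.7, "data_misuse": 0.5}

print(weighs_in_favor_of_disclosure(gratifications, risks))  # True (1.4 > 1.2)
```

The point of the sketch is only that, under this paradigm, disclosure is the outcome of a situational weighing process rather than of control alone.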
To follow up on the sentiment that users wish to have informational control but that control seems to contradict the networked nature of social media, Moor (1997) and later Tavani (2007) reflected on the relationship between privacy and control. They argued that control and privacy should be seen as separate constructs and that privacy and control serve very different functions. With the Restricted Access/Limited Control (RALC) theory of privacy, these authors defined privacy in terms of the individual’s protection from intrusion and information gathering by third parties. They argued that control in the information age is impossible and further that “We can have control but no privacy, and privacy but no control” (Tavani & Moor, 2001, p. 6). They suggested that privacy and control should be separated such that privacy is a concept and a value that is defined by being protected from information access by others, whereas control is one mechanism that can be used to manage and justify privacy. Control may be exerted through choice, consent, or correction. In the flow of the exchange of digital information, people choose situations according to their communication goals, level of access, and emerging privacy needs (Trepte & Masur, 2017); then, privacy is maintained through the processes of consent, and finally, corrections allow people to restore their privacy when it gets lost or threatened. For the upcoming social media privacy model, I will refer to this notion that control is one mechanism among others, and I will explain that for all processes (i.e., choice, consent, correction), individuals have to get in touch with others and communicate their motives and aims.
With her theory of contextual integrity, Nissenbaum (2010) also addressed the contextual requirements as boundary conditions, regardless of whether control is a functional mechanism or not. She suggested that the two sets of theories be married: those referring to privacy as a constraint on access and those referring to privacy as a form of control. In her theory of contextual integrity, Nissenbaum (2010) described control as one “transmission principle” (p. 145) that defines how information is exchanged. Other transmission principles are reciprocity and confidentiality. Control as a transmission principle is appropriate only if it fits into the particular context, the subject that users are talking about, the type of information that is to be exchanged, and the actors they communicate with. From this point of view, there can be privacy without control in situations in which control is inappropriate or not available (Laufer & Wolfe, 1977; Slater, 2007).
Current privacy theories pushed the idea of control as a dynamic attribute of the situation one crucial step further. According to Dienlin’s (2014) privacy process model, individuals assess the controllability of the context and behavior. Masur (2019) adds an analysis of what is being controlled by disentangling interpersonal factors (e.g., the relationship between interaction partners) and external factors (e.g., the architecture of a room) in his theory of situational privacy and self-disclosure. Depending on the situation, these interpersonal and external factors can be controlled to different degrees, and in turn, they can elicit differential levels of self-disclosure. In the remainder of this article, self-disclosure will be understood as “the intentional communication of information about the self to another person or group of people” (Masur, 2019, p. 70).
The notion that privacy and control are not necessarily connected has been supported by previous research (Saeri, Ogilvie, La Macchia, Smith, & Louis, 2014). For example, Zlatolas, Welzer, Heričko, and Hölbl (2015) demonstrated that privacy norms, policies, and awareness but not privacy control were related to the self-disclosures of N = 673 Slovenian Facebook users. In a U.S. sample of N = 249 Facebook users, Taneja, Vitrano, and Gengo (2014) found that perceived behavioral control and the intention to engage in privacy-related behavior were unrelated. Eastin et al. (2016) investigated how different variables predicted mobile commerce activity and found that control was the one that explained the smallest amount of variance. In particular, trust and attitude toward mobile commerce were more important predictors than control. In sum, individuals who perceived that they had control over their personal data did not necessarily feel they had more privacy and did not increasingly engage in self-disclosure. Further, trust and norms were identified as important alternative mechanisms of privacy (Brandimarte et al., 2013; Eastin et al., 2016; Nissenbaum, 2010; Zlatolas et al., 2015). I will refer to both findings in the social media privacy model.
The Interplay of Affordances, Control, and Privacy
The lack of a relation between privacy and control might hint that the interplay of the two variables is not linear and is actually more complex (Laufer & Wolfe, 1977). The relation between control and privacy should become clearer if the social media boundary conditions that make control a functional mechanism in one situation but impractical in another are elucidated.
Social Media and its Boundary Conditions for Privacy
Carr and Hayes (2015) defined social media as “…Internet-based channels that allow users to opportunistically interact and selectively self-present, either in real-time or asynchronously, with both broad and narrow audiences who derive value from user-generated content and the perception of interaction with others” (p. 50). They further pointed out that users’ interaction will increasingly be influenced by social media affordances. Further, social media has been characterized by its content, its users, and its infrastructure in previous definitions (Howard & Parks, 2012). The most prominent examples of social media are social network sites (e.g., Facebook, Instagram, LinkedIn, Google+), multimedia platforms (e.g., YouTube, Slideshare, Soundcloud), weblogs (e.g., personal diaries of mothers, scholars, or self-appointed or paid influencers), and microblogs (e.g., Twitter). In struggling to develop a definition of social media, scholars have pointed to the fact that social media channels are formally understood as methods of mass communication but that they primarily contain and perpetuate personal user interactions (Carr & Hayes, 2015; Papacharissi, 2010). In this sense, social media can be referred to as personal publics (Schmidt, 2014). As a consequence, users cannot always clearly define the somewhat blurred lines between personal and public or between private and professional communication. They feel that contexts collapse and converge (Papacharissi, 2010; Vitak, 2012).
In sum, social media is characterized by the following boundary conditions: the content, its flow, and its further uses (Howard & Parks, 2012); the communication practices that users perceive as their options for exerting control or for achieving privacy with other means; and social media affordances (Carr & Hayes, 2015). In the following, I will analyze how the interplay of these boundary conditions is related to control and how it determines different privacy perceptions and behaviors. Tables 1 and 2 in the supplemental materials summarize this theoretical development.
Social Media Boundary Condition 1: Content, its Flow and Uses
What exactly does an individual strive to control? The sooner we come to understand what individuals strive to control, the better we can evaluate whether control can be experienced in social media. According to most studies, personal information is the content that people strive to control in order to maintain their privacy in social media. Metzger (2004) referred to personal information as the content to be controlled. Quinn (2014) suggested different layers on which privacy can be maintained. On the “content layer,” users’ experience of a lack of control leads them to limit the information they post or even to post false information. On the basis of their qualitative work, Sarikakis and Winter (2017) added that users do not differentiate between personal information and personal data. Instead, they define the degree of intimacy or privacy needed for a certain piece of information or data.
Then, besides personal information, the flow and use of the content need to be considered. Social media advocates specifically address where online information is forwarded, archived, and sold. They emphasize users’ concerns about how little control they have over the flow and use of personal information (Marwick & boyd, 2014; Quinn, 2014; Tsay-Vogel, Shanahan, & Signorielli, 2018). This refers to the forms personal information takes, to where it ends up, and to how it is used. In the following, personal information, its flow, and its further uses will be considered as what users strive to control.
Social Media Boundary Condition 2: Practices of Control
Actively exerting control expresses informational self-determination, which implies having power over information and agency in decisions regarding this information. In turn, a loss of control would mean that other behavioral options are out of reach and that individuality (Buss, 2001), power, and agency are severely threatened (Brummett & Steuber, 2015). Control also comes along with risk avoidance: Users have identified the most important pieces of information that they want to control as the contents of their emails, the contents of their online chats, and their location (Cho, Lee, & Chung, 2010). As long as they have control over this information, they can avoid being harassed, bullied, financially exploited by companies, or surveilled by governmental institutions.
How is control executed and achieved? First, informational control can be identified as an individual’s perception that he or she has a choice about whether to withhold or disclose information (Crowley, 2017; Johnson, 1974). Choice is the first step and determines whether control can be exerted and to what degree (Wolfe & Laufer, 1974). Then, in the next step, when choice is available, it has to be put into action. Two active practices of control in social media are consent and correction (Tavani, 2007). Consent refers to the extent to which users agree that a certain piece of information will be passed along. Correction means that users are able to withdraw from this agreement. Whereas choice was identified long ago as a practice of control (Johnson, 1974), consent and correction were suggested as social media practices (Tavani, 2007). Further, informational control can also be put into practice by the selective sharing of information, self-censorship, audience segregation, and encryption (Ochs & Büttner, 2018). All of these options are directed by the individual and can be considered to be ego-centered. It will be important to also find terms for interpersonal privacy regulation behaviors.
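To make the distinction between the three practices concrete, the following sketch models choice, consent, and correction as operations on a single piece of information. The class and method names are hypothetical and serve only to illustrate the logic; they are not a formalization proposed in the article or by Tavani (2007):

```python
# Hypothetical sketch of the three control practices named above.
# Choice: decide whether to disclose at all; consent: agree that the
# information may be passed along for a given use; correction: withdraw
# a previously given agreement. All names are illustrative only.

class PieceOfInformation:
    def __init__(self, content):
        self.content = content
        self.disclosed = False       # choice has not yet been exercised
        self.consented_uses = set()  # further uses the owner has agreed to

    def choose_to_disclose(self):
        """Choice: the owner decides to share the information."""
        self.disclosed = True

    def give_consent(self, use):
        """Consent: agree that the information may be used this way."""
        if self.disclosed:
            self.consented_uses.add(use)

    def correct(self, use):
        """Correction: withdraw an agreement that was given earlier."""
        self.consented_uses.discard(use)

info = PieceOfInformation("vacation photo")
info.choose_to_disclose()
info.give_consent("forwarding to friends of friends")
info.correct("forwarding to friends of friends")
print(info.consented_uses)  # set()
```

Note that the sketch treats all three practices as ego-centered operations by a single owner, which is precisely the limitation pointed out above for interpersonal privacy regulation.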
Social Media Boundary Condition 3: Affordances
Social media can be characterized by affordances. The term affordance, initially suggested by Gibson (1979/2014), means that the environmental characteristics of a certain entity are not static but are perceived and experienced differently and, as such, shaped by humans. In the case of social media, this understanding is more than suitable. Of course, the process of shaping or “furnishing” (Gibson, 1979/2014, p. 78) social media influences its further uses. For example, teenage users regulate their privacy through social steganography, an idiomatic language that is understood only by their peers but not their parents, instead of managing their audiences by editing their friends lists and systematically blocking their parents or certain peers (boyd, 2014). Inventing and using this kind of idiomatic language might influence users’ style of communication and interactions on social media. Four affordances have repeatedly been shown to be particularly important for social media: anonymity, editability, association, and persistence (boyd, 2008; Evans, Pearce, Vitak, & Treem, 2017; Treem & Leonardi, 2012). The affordances of anonymity and editability allow users to exert control. By contrast, the affordances of association and persistence challenge users’ ability to exert control. Both clusters of affordances—those enhancing (Table 1) as well as those challenging control (Table 2)—have different implications for how content is controlled, which practices of control are available, and how they affect privacy regulation in social media realms. In the following, I intertwine the results from previous research on privacy, the content and practices of control, and affordances.
Anonymity. The affordance of anonymity describes the idea that other social media agents such as other private people, institutions, or companies do not know the source of the message or the sender (Evans et al., 2017). In social media use, being unknown to others and using social media anonymously is rare (Rainie, Kiesler, Kang, & Madden, 2013). Nevertheless, anonymity is still appreciated occasionally. For example, users commenting on other users’ posts in self-help or political forums can decide to keep their interactions anonymous. In addition, users of dating websites might at least partly or temporarily use such sites anonymously (Ramirez, Sumner, Fleuriet, & Cole, 2015). Anonymity is not a question of “on” or “off” but is flexible and scalable (Evans et al., 2017). Anonymity has spurred enormous attention in research on computer-mediated communication (CMC; Joinson, 2001). Here, anonymity is specifically considered a means to control the access of certain individuals (Qian & Scott, 2007). Further, while being online anonymously, an individual can deliberately decide what information to share, with whom to share it, and what to withhold (Qian & Scott, 2007). When the sender is anonymous, the receiver of a message cannot complement the CMC with face-to-face cues, and as such, the control lies in the hands of the sender (Ben-Ze'ev, 2003).
Control over content, its flow, and its
uses is possible because, in a state of
anonymity, all
of these are disconnected from the user.
Although full anonymity is not afforded
by social media,
U.S. American users occasionally keep
their posts anonymous while using
social media with the
clear aim of exerting control (Rainie et
al., 2013). For example, they disguise
their location or
delete cookies so that companies
cannot identify them. An interview
partner in Sarikakis and
Winters’ (2017) study said: “Well
when I use fake names or email
addresses and fake birthdates I
think that’s the only control you can
try to have” (p. 9). Two aspects,
though, might mitigate the
perception of control. First, users know
that they leave traces behind, and once
they have posted
content online—even if it was posted
anonymously—they might be traceable
because of other
information they left online; second,
anonymity is usually used only to some
extent (e.g., by
leaving one’s name but not one’s
address), and users acknowledge that
with partial anonymity,
they experience only partial control,
and in turn, only partial privacy (Rainie
et al., 2013). What
are the available practices users have to
exert control? First, being online
anonymously can be
considered a question of choice. Woo
(2016) suggested that anonymity—or
by contrast,
identifiability—is the most important
issue in online privacy. He argued that
on the social web,
users should have “islands” of
anonymity. He even encouraged
people to lie and to have
secrets with the aim of regaining
control and autonomy. Then, however,
in an anonymous setting,
consent and corrections are somewhat
disconnected from the user as these are
identity-related
practices.
Anonymity can be an enactment of
control by disguising or lying about
one’s identity or
by leaving it unspecified for some
applications and occasions. Also,
people may leave the source
of their own messages unknown. And,
of course, these enactments of control
might be applied
when interacting with some users but
not with others. In sum, the affordance
of anonymity is
related to informational control, which
has also been demonstrated in
empirical studies (Fox &
McEwan, 2017).
In previous research on privacy,
anonymity has played a crucial role.
Here, it was even
understood as a “type” of privacy
among other types such as solitude,
intimacy, or reserve
(Pedersen, 1999; Westin, 1967). In
sum, anonymity allows users to exert
control and, in turn, it
increases an individual’s subjective
experience of privacy by not being
identifiable at all or by
selectively presenting parts of one’s
own identity (Smith et al., 2011; Woo,
2016).
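The scalable rather than binary character of anonymity lends itself to a simple computational illustration. In the following minimal sketch, the identity fields and the scoring rule are my own illustrative assumptions, not taken from the cited studies; the degree of anonymity is treated as the share of identity fields a user withholds or fakes:

```python
# Illustrative sketch: anonymity as a scalable property, not an on/off switch.
# A user may withhold or fake some identity fields while revealing others;
# the share of hidden or faked fields serves as a rough degree of anonymity.
IDENTITY_FIELDS = ("name", "email", "birthdate", "location", "photo")

def anonymity_degree(disclosed):
    """Return the fraction of identity fields kept hidden (absent) or faked."""
    withheld = [f for f in IDENTITY_FIELDS
                if disclosed.get(f) in (None, "fake")]
    return len(withheld) / len(IDENTITY_FIELDS)

# A profile that reveals the real name but fakes or withholds the rest
# (partial anonymity and, in turn, only partial control and privacy):
profile = {"name": "real", "email": "fake", "birthdate": "fake"}
print(anonymity_degree(profile))  # 0.8
```

Full identifiability yields 0.0 and full anonymity 1.0, mirroring the observation above that partial anonymity affords only partial control.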
Editability. While using social media,
users interact remotely in terms of time
and space.
This gives them the chance to edit their
online communications before and
after the
communications are seen by others
(Treem & Leonardi, 2012). Editability
is an affordance that
was previously addressed as important
for CMC in the hyperpersonal model
(Walther, 1996):
Senders of online messages present themselves selectively, primarily transmitting the cues and messages that they want others to see and that put them in a positive light. In
addition, although editing one’s identity
is part of any social interaction, social
media platforms
offer more freedom in terms of what,
how, and when these interactions are
edited. Editing allows
users to rehearse, change, package, or
literally craft a message and, in turn, to
rehearse, change,
and craft their personal appearance.
Editability allows the message sender
control over content and its flow and
uses because
users have the chance to ponder the
consequences of their posts (Treem &
Leonardi, 2012).
Further, users may control the flow and
further use of their content by
articulating the lists of
online friends that they connect with or
by using a private or public profile
(Ellison & boyd,
2013). The availability of control
practices is highly supported by social
media’s affordance of
editability (Fox & McEwan, 2017).
Users have a choice to either intuitively
post their thoughts or
to pictorially represent their nonverbal
cues. Editing can be considered an
active enactment of
control because users deliberately
choose what to reveal to certain
audiences or what to withhold
(Crowley, 2017). Further, corrections
of one’s own posts and decisions are
possible and can also
be conceived as an enactment of
control (Crowley, 2017).
Control over the flow of information
might also foster subjective
experiences of privacy. In
privacy research, exerting control over
the flow of an interaction was often
understood
synonymously with control or as a
transmission principle that guaranteed
privacy (Nissenbaum,
2010).
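The control practices named above (articulated friend lists, private versus public profiles) can be sketched as a simple allow-list check on each post. The class and attribute names below are illustrative assumptions, not any platform's actual API:

```python
# Illustrative sketch: editability-based control over the flow of content.
# A sender crafts a post and decides which articulated friend list (if any)
# may access it; a public profile corresponds to no audience restriction.
class Post:
    def __init__(self, content, audience=None):
        self.content = content    # editable before and after publishing
        self.audience = audience  # set of user names, or None for public

    def visible_to(self, viewer):
        """Public posts flow to anyone; restricted posts only to the list."""
        return self.audience is None or viewer in self.audience

post = Post("weekend photos", audience={"alice", "bob"})
print(post.visible_to("alice"))  # True
print(post.visible_to("eve"))    # False
```

The sender alone decides the audience per post, which is precisely the ego-centered character of control that the next section contrasts with association.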
Association. Social media platforms are
primarily used because they offer users
the chance
to connect with others and to stay in
touch. Users have articulated the idea
that communication is
their most important motive for using
social network sites (Quinn, 2016).
Consequently, the most
important affordance of social media is
the associations that are created or
maintained between
interaction partners (Ellison & boyd,
2013; Treem & Leonardi, 2012).
The affordance of association and the
chance to exert control over content, its
flow and
uses seem strikingly incompatible. In
their cross-sectional study of N = 875
Mechanical Turk
workers, Fox and McEwan (2017)
demonstrated that informational
control and network
association were negatively related.
Control is exerted by an individual
person, and as such, the
individual person is at the center if not
entirely responsible for achieving
privacy via control. This
is clearly expressed in users’ current
understanding of control. For example,
participants in
Sarikakis and Winter’s (2017) study
identified the individual as the
legitimate “controller” of
privacy (p. 6). Also, boyd (2014)
argued that exerting control with the
aim of achieving privacy
requires the individual’s full
consideration, power, knowledge, and
skills. The only control
practice available would be to
completely withdraw (i.e., not to
participate) and to accept the
disadvantages that come along with
such a decision. Then, ambiguous and
passive enactments of
control might be used, but these have
the same disadvantages.
In sum, control as an issue and a
concept takes the perspective of the
individual. In other
words, it is an “ego-centered need”
(Papacharissi, 2010, p. 144). By
contrast, association is an
interindividual concept. Very likely,
one person’s control ends at the point
where another
person’s control starts. Hence, as much
as the social web is an interdependent
realm involving
other parties, control is not the means
to ensure or exert privacy. On the basis
of these
considerations—and this will be
important for the upcoming
propositions on social media
privacy—other means and mechanisms
to guarantee the subjective experience
of privacy have
become necessary: Users communicate
with each other to ensure privacy. And
further, if
communication is not possible, they
choose communication partners—
individuals, organizations,
institutions—that they can trust. Trust
has been shown to be crucial for
ensuring the perception of
privacy and subsequent online
disclosure (Metzger, 2004). Trust can
be established by personal
communication (Petronio, 2002) and by
norms that the individual user can rely
on (Saeri et al.,
2014). As such, in the upcoming
propositions and the social media
privacy model, trust,
communication, and norms are
conceptualized as the core mechanisms
to ensure privacy beyond
control.
Persistence. The affordance of
persistence addresses the durability of
online expressions
and content (boyd, 2014) and the idea
that after personal information is
published online, it is
automatically recorded and archived
and is consequently replicable (boyd,
2008). It means that
data remain accessible in the same
form over long periods of time and for
diverse and unforeseen
audiences (Evans et al., 2017; Treem &
Leonardi, 2012). Some authors have
emphasized the
positive outcomes that persistence may
have, namely, that it allows knowledge
to be sustained,
creates robust forms of
communication, establishes the growth
of content (Treem & Leonardi,
2012), and allows for authenticity and
longevity (boyd, 2008).
However, as persistence captures the enduring, unbounded nature of online data, it also
expresses a loss of informational self-
determination. Persistence seems
incompatible with
control. It evokes the idea that control
over content, its flow, and uses is
impossible because once
personal information is posted online,
it is no longer under the sender’s
control. With regard to
control practices, social media users
have the choice to completely
withdraw from online
interactions, to not post their
information, and thus to avoid its
persistence. At the same time, this
would prevent them from receiving the
myriad benefits that come along with
data sharing and
thus does not seem to be a question of
freedom of choice anymore. Also, as
soon as the choice is
made to participate, other control
practices such as consent and correction
significantly decrease.
Finally, once given, consent decreases
a person’s chances to subsequently
correct previous online
communication. Users know that they
do not have control over how persistent
their data will be,
and this significantly decreases their
subjective experience of privacy
(Rainie et al., 2013). The
lack of control over personal
information due to the persistence of
all kinds of online information
can be considered one of the key issues
of online lives and the subjective
experience of privacy
today. To react to users’ needs to
foresee and understand persistence
(Rainie et al., 2013),
communication, trust and norms seem
to be adequate ways to ensure privacy.
The Social Media Privacy Model
Social media privacy is based on
interpersonal processes of mutual
disclosure and
communication (Altman, 1975;
Petronio, 2002). Further, it can be
considered a value that is co-
developed by engaging in
communication and that is expressed
by a shared perception
(Nissenbaum, 2010; Smith et al.,
2011). On the basis of these
considerations, I propose:
Proposition 1: Privacy is
interdependently perceived and valued.
In contrast to privacy, control is at the
center of the individual person. Control
is exerted
by the individual person, and it can be
considered to be ego-centered
(Papacharissi, 2010;
Sarikakis & Winter, 2017). Social
media platforms aim for
connectedness, interdependence, and
sociality and can be described as
social-centered (Ellison & boyd, 2013).
As a consequence,
social media privacy cannot be
sufficiently achieved by exerting
control. Other mechanisms are
necessary to ensure social media
privacy.
Proposition 2: Privacy cannot be
satisfactorily achieved by exerting
control in social media.
Instead of striving for control as an
end-game of privacy, the opposite is
necessary to
experience privacy in social media.
Users need to constantly communicate
with each other as
well as with institutions and companies
to ensure their privacy. They need to
engage in both
interpersonal and deliberative
communication processes.
Interpersonal communication is
understood as interactions between
users as well as interactions between
the individual user and
others who represent third parties such
as institutions and companies.
Deliberation is defined as
either informal or institutionalized
interaction among internet users (and
eventually
representatives of governments,
institutions, or companies), involving
rational-critical decision
making and the earnest aim to find a
solution (Burkhalter, Gastil, &
Kelshaw, 2002).
Proposition 3: Interpersonal
communication is a mechanism by
which social media privacy can
be interdependently ensured and put into
effect.
However, not all actions and steps of
online media use can be accompanied
by
communication processes. For many if
not most questions of privacy, people
can rely on past
experiences. Here, communication and
deliberation crystallize into a stable
relationship-based
result or even solution, i.e., trust (or mistrust) and norms (or anomie). Trust
can be defined as an
anticipation and expectation of how a
person or institution will behave and as
such reduces
uncertainty (Waldman, 2018, p. 4).
Trust has been shown to ensure privacy
on the basis of
longstanding communication and
reliable bonds (Saeri et al., 2014). Trust
can be conceived as
both a crucial factor of influence in
decisions over self-disclosure and as a
result of
communication (Saeri et al., 2014). In
turn, mistrust—which has not yet been
addressed in
privacy research—is a menace to
privacy and as such should initiate
communication. In a
qualitative study on privacy
perceptions, Teutsch, Masur, and
Trepte (2018) demonstrated that
participants perceived that they had lost
control and had substituted trust for
control. One of the
interview partners said, “Well, privacy
is absolute trust between conversational
partners and
absolute, absolute certainty that the
subject of conversation will stay within
this sphere” (p. 7).
Eichenhofer (2019) suggested that the
“trust paradigm” should point toward a
more current
perspective on privacy regulation via
trust in contrast to privacy regulation
via control or self-
determination.
In the case of privacy regulation, both
social and legislative norms come into
play (Gusy,
2018; Spottswood & Hancock, 2017;
Utz & Krämer, 2009). Social norms are
understood as
social pressure to engage in a certain
kind of behavior and are established by
what others approve
of (injunctive norms) and what they
actually do (descriptive norms) (Saeri
et al., 2014). Although
legislative norms are shaped by jurisprudence and codified in law (rather than derived from observing others), they share with social norms the features that they prescribe a certain behavior and allow for sanctions if this behavior is not shown. Previous research has shown that trust and norms are the keys to obtaining privacy
(Marwick & boyd, 2014; Quinn, 2014).
To establish trust
and norms, of course, communication
is necessary.
Proposition 4: Trust and norms
function as privacy mechanisms that
represent
crystallized privacy communication.
I suggest that control and
communication have found a new
balance in social media
communication: Control is losing
control and communication is gaining
power. In other words,
users do not solely rely on and strive
for control in the first place but strive to
communicate about
privacy to establish norms and trust
and even sometimes to regain control.
In fact, privacy’s
interdependence is expressed and put
forward by interpersonal
communication. This emerging
social turn in privacy theory is also
acknowledged in the definition of
privacy.
I define privacy by an individual’s assessments
of (a) the level of access to this
person in an interaction or relationship with
others (people, companies,
institutions) and (b) the availability of the
mechanisms of control,
interpersonal communication, trust, and norms
for shaping this level of access
through (c) self-disclosure as (almost intuitive)
behavioral privacy regulation
and (d) control, interpersonal communication,
and deliberation as means for
ensuring (a somewhat more elaborated)
regulation of privacy. In social media,
then, the availability of the mechanisms that
can be applied to ensure privacy
is crucially influenced by the content that is
being shared and the social media
affordances that determine how this content is
further used.
In the following, I will summarize the
theoretical rationale developed in this
article in the
chronology of a communicative
process. Further, I will show how the
individual’s privacy
assessments referred to in the four
propositions eventually lead to different
forms of privacy
regulation behaviors. The process is
illustrated in Figure 1. The following
steps are meant to
make the model accessible for
empirical investigation.
The first part of the flowchart refers to
the social media user’s subjective and
initial
assessment: All humans have
individual levels of access they
perceive as being more or less
adequate and comfortable. This
dispositional level of access is a
quantifiable dimension varying
between high and low levels and
expressing the individual’s
dispositional willingness to self-
disclose. In contrast, the
communication goal is rooted in the
situation and defines what is to be
achieved in this particular situation.
The individual’s communication goals
in social media are
manifold and important to consider
when assessing online privacy. They
can most likely be
understood as a qualitative scenario of
what the individual user wants and
needs to communicate
in a certain situation. Hence, the point
of departure for each and any
consideration about privacy
are the more or less consciously asked
questions: How do I feel, what do I
need, and what is my
goal for this particular situation?
The second part of the model refers to
the social media boundary conditions
that are
encountered. Here, content and
affordances dynamically interact (as
indicated with the
multiplication sign) with an
individual’s initial assessment. The
individual weighs the ideal level
of access and his/her communication
goals against social media boundary
conditions by
considering what content is shared and where; how it might flow from one user or institution to another; and how it might be used.
Social media content becomes dynamic
as it is displayed and
shared. Affordances represent this
dynamic and, together with the
individual’s dispositions and
goals, shape the available privacy
mechanisms: control, trust, norms, and
interpersonal
communication. Hence, users assess
whether they have a choice, whether
they can rely on trust or
norms, or whether they will (have to)
engage in interpersonal
communication.
The third part of the model refers to the
subjective experience of privacy. The
individual
experiences a certain level of access
that results from the individual’s goals
on the one hand and
the social media boundary conditions
and privacy mechanisms that are
applied to actively
regulate privacy on the other. This
experience is understood here as the rather unfiltered accumulation of external stimuli and internal needs; it then results in a more elaborate re-assessment, i.e., the privacy perception, which can be verbalized and is empirically accessible.
The privacy perception results in
different forms of privacy regulation
behaviors. First,
self-disclosure belongs to the most
intuitive regulation behaviors and
includes all information
intentionally shared (or not shared)
with others. And, for the case that the
privacy mechanism of
control is available and considered
adequate for a certain communication
goal, users exert control
actively and intentionally by restricting access to information, segregating audiences, censoring themselves, or encrypting content; or, more ambiguously, by softening the truth or obfuscating information.
When the privacy mechanisms do not allow for deliberate and somewhat egocentric privacy regulation (i.e., when individuals do not have at least partial control), other
regulation behaviors
come into play. The individual might
engage in interpersonal communication
or even deliberation
to negotiate and interdependently
regulate privacy. Interpersonal
communication, deliberation,
and control are meta-level regulation
behaviors that come into play when
privacy behaviors are
not intuitive, when contexts collapse, or
when a certain situation demands
further elaboration
and/or communication.
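For empirical or computational work, the three parts of the flowchart can be summarized in a schematic sketch. All class names, the mechanism-selection rule, and the numeric comparison below are illustrative assumptions of mine, not an operationalization of Figure 1:

```python
# Schematic sketch of the model's process: (1) the user's initial assessment,
# (2) boundary conditions that shape the available privacy mechanisms, and
# (3) the resulting choice of a privacy regulation behavior.
from dataclasses import dataclass, field

@dataclass
class Assessment:                  # part 1: dispositions and goals
    dispositional_access: float    # willingness to self-disclose, 0..1
    communication_goal: str        # situational goal

@dataclass
class BoundaryConditions:          # part 2: content x affordances
    content_sensitivity: float     # how delicate the shared content is, 0..1
    affordances: set = field(default_factory=set)

def available_mechanisms(bc):
    """Affordances and content jointly shape the available mechanisms."""
    mechanisms = ["interpersonal communication"]      # always an option
    if {"anonymity", "editability"} & bc.affordances:
        mechanisms.append("control")
    if "association" in bc.affordances:
        mechanisms += ["trust", "norms"]
    return mechanisms

def regulate(a, bc):
    """Part 3: intuitive, control-based regulation when control is available
    and the content stays within the user's comfort zone; otherwise fall
    back on interdependent negotiation via communication."""
    if ("control" in available_mechanisms(bc)
            and bc.content_sensitivity <= a.dispositional_access):
        return "self-disclosure with control"
    return "interpersonal communication / deliberation"
```

For instance, a user with a high willingness to disclose who posts innocuous content on an editable platform would regulate via control, whereas highly sensitive content in a purely association-driven setting would push the same user toward communication.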
The communication process shown in
Figure 1 will take turns in a constant
flow of
assessments and re-assessments as soon
as the reactions of others lead to
changes in conditions or
when personal goals or needs change.
In what follows, I will discuss the
capabilities and
consequences of the model’s theoretical
propositions. What are the possible
effects and what are
the pitfalls and opportunities that will
occur if communication, trust, and
norms are substituted
for control?
Challenging the Social Media Privacy
Model
Social media use is at the heart of
human communication and offers all of
its many merits
such as practicing freedom of speech or
reaching out to people in critical living
conditions and
providing them with social support. In
addition, social media communication
offers particular
benefits because it is ubiquitous and
independent of time and location. As
such, for example, it
allows people to communicate across
borders into the lands of friends, foes,
and fiends. All of
these merits of online communication
give its users enormous freedom. This
freedom is—and
this is again one of the major merits of
social media communication—often
independent of the
domestic or cultural situation of the
user. Freedom is historically closely
linked to privacy. In
medieval times, only those who had
land were free (Moore, 1984). And
only those who
possessed land had the freedom to
withdraw, grant, or restrict access to
their land. In turn, the
“unfree,” who did not have their own
land or possessions, did not have the
right to privacy. In
this sense, freedom is the origin both in
legislation and the genealogy of privacy
(Gusy, 2018).
For social media and online realms,
freedom and privacy are closely
connected. However,
the ideas of possessions and ownership
do not seem fully applicable anymore.
Borders and
possessions have become fuzzy
because, very often, data and personal
information are perceived
as shared goods as soon as they appear
online. Nissenbaum (2010)
summarized her work on
contextual integrity with the words:
“We have a right to privacy, but it is
neither a right to control
personal information nor a right to
have access to this information
restricted” (p. 231). She
conceded that for social media, privacy
is rather a value and a perception.
Borders and possessions have a
permanent nature. By contrast, values
and perception are
subject to interpretation and change.
This understanding of privacy as
subject to interpretation
and change makes it a matter of
communication. Eventually,
communication about privacy will
result in trust, and if successfully
shared in a society, in social and
legislative norms. However,
communication cannot be understood
as a long and painful process that will
finally show the
solution and lead us into the light of
privacy. Just the opposite is the case. In
social media,
communication about data sharing and
the use and flow of interpersonal data
is the solution itself.
Only due to ongoing and critical
assessment, reassessment, and dynamic
communication will we
have the chance to ensure privacy as
one of the most important values of
civilized societies.
Privacy’s interdependence is expressed
and put forward by interpersonal
communication.
And in fact, privacy is experiencing a
“social turn”, in social media and
beyond (Helm &
Eichenhofer, 2019). This emerging
social turn in privacy theory is
acknowledged in the social
media privacy model. However, there
are downsides to a conception of
privacy as a
communicative process. First, not all
members of this communicative
process will have equal
chances of being heard. For example,
digital and literacy gaps have been
shown to crucially
influence participation in these
communication processes (Helsper,
2017).
Second, online information has become
a commodified good, and financial
interests are
strong (Sevignani, 2016). Purposeless
communication among friends is
increasingly exploited for
economic reasons (Seubert & Becker,
2019); and, companies do their best to
avoid interactions
between individual users who strive to
regulate their privacy. In turn, they try
to establish trust
through strong branding activities that
have been shown to override social
media users’ privacy
concerns and their interest in solving
privacy issues by actively
communicating and participating
(Boerman, Kruikemeier, & Zuiderveen
Borgesius, 2018; Li, 2011).
Third, as a consequence, the
requirements of communication and
trust demand a lot from
users. Control implies a settled state in
which the individual person can lie
back and stop thinking
about the flow of personal information
online. Communication, trust, and
norms, by contrast, are
subject to change and thus require
constant assessment and consideration.
Hence, information
control should also be considered with
regard to an individual’s ability to
exert control (Grimm &
Bräunlich, 2015). The user-centered perspective needs to be complemented and accompanied by a system-based perspective and
respective interventions (Schäwel,
2020).
Fourth, this demanding process of
communication might also result in a
threat to the
ability to develop a self-determined
identity. Westin (1967) held that
boundary control means
identity control: “This deliberate
penetration of the individual’s
protective shell, his
psychological armor, would leave him
naked to ridicule and shame and would
put him under the
control of those who knew his secrets”
(p. 33). As a consequence, lost control
would mean
threats to identity development.
Finally, I embrace the demanding and
somewhat stressful nature of
communicating about
privacy in social media. In social
media, there is only limited control
over personal information.
In addition, the handling of this lack of
control is demanding and stressful. By
acknowledging
these two observations, users will
acknowledge that they need to take
action, engage in
communication, and establish trust and
shared social and legislative norms. In
social media,
privacy is not a private affair. It is at
the center of communication. We are
out of control because
we have so much to share. Hence,
interpersonal communication, trust,
and norms are the three
most important mechanisms that
interdependently help to ensure social
media privacy.
References
Altman, I. (1975). The environment
and social behavior: Privacy, personal
space, territory,
crowding. Monterey, CA:
Brooks/Cole Publishing Company.
Anderson, J., Rainie, L., & Duggan,
M. (2014). Digital life in 2025.
Retrieved from
http://www.pewinternet.org/
2014/03/11/digital-life-in-2025/
Ben-Ze'ev, A. (2003). Privacy,
emotional closeness, and openness in
cyberspace. Computers
in Human Behavior, 19(4), 451–467.
https://doi.org/10.1016/S0747-
5632(02)00078-X
Boerman, S. C., Kruikemeier, S., &
Zuiderveen Borgesius, F. J. (2018).
Exploring
motivations for online privacy
protection behavior: Insights from
panel data.
Communication Research, 25.
https://doi.org/10.1177/009365021880
0915
boyd, d. (2008). Taken out of context.
American teen sociality in networked
publics (Doctoral
dissertation). University of California,
Berkeley.
boyd, d. (2014). It's complicated. The
social lives of networked teens. New
Haven, CT: Yale
University Press.
Brandimarte, L., Acquisti, A., &
Loewenstein, G. (2013). Misplaced
confidences: Privacy and
the control paradox. Social
Psychological and Personality Science,
4(3), 340–347.
https://doi.org/
10.1177/1948550612455931
Brummett, E. A., & Steuber, K. R.
(2015). To reveal or conceal? Privacy
management
processes among interracial romantic
partners. Western Journal of
Communication, 79(1),
22–44.
https://doi.org/10.1080/10570314.201
4.943417
Burgoon, J. K. (1982). Privacy and
communication. Communication
Yearbook, 6, 206–
249. https://doi.org/10.1080/23808985
Burkhalter, S., Gastil, J., & Kelshaw,
T. (2002). A conceptual definition and
theoretical model
of public deliberation in small face-to-
face groups. Communication Theory,
12(4), 398–
422.
https://doi.org/10.1093/ct/12.4.398
Buss, A. (2001). Psychological
dimensions of the self. Thousand
Oaks, CA: Sage.
Carr, C. T., & Hayes, R. A. (2015).
Social media: Defining, developing,
and divining. Atlantic
Journal of Communication, 23(1), 46–
65.
https://doi.org/10.1080/15456870.201
5.972282
Cho, H., Lee, J.-S., & Chung, S.
(2010). Optimistic bias about online
privacy risks: Testing
the moderating effects of perceived
controllability and prior experience.
Computers in
Human Behavior, 26, 987–995.
https://doi.org/10.1016/j.chb.2010.02.
012
Crowley, J. L. (2017). A framework
of relational information control: A
review and extension
of information control research in
interpersonal contexts.
Communication Theory, 27(2),
202–222.
https://doi.org/10.1111/comt.12115
Derlega, V. J., Metts, S., Petronio, S.,
& Margulis, S. T. (1993). Self-
disclosure. Sage series
on close relationships. Newbury Park,
CA: Sage Publications.
Dienlin, T. (2014). The privacy
process model. In S. Garnett, S. Halft,
M. Herz, & J. M.
Mönig (Eds.), Medien und Privatheit
[Media and privacy] (pp. 105–122).
Passau,
Germany: Karl Stutz.
Eastin, M. S., Brinson, N. H., Doorey,
A., & Wilcox, G. (2016). Living in a
big data world:
Predicting mobile commerce activity
through privacy concerns. Computers
in Human
Behavior, 58, 214–220.
https://doi.org/10.1016/j.chb.2015.12.
050
Eichenhofer, J. (2019). e-Privacy -
Theorie und Dogmatik eines
europäischen
Privatheitsschutzes im Internet-
Zeitalter [Theoretical and doctrinal
foundations of a
European privacy protection
regulation in the internet age].
Bielefeld: University of
Bielefeld.
Ellison, N. B., & boyd, d. (2013).
Sociality through social network sites.
In W. H. Dutton
(Ed.), The Oxford handbook of
Internet studies (pp. 151–172).
Oxford, UK: Oxford
University Press.
European Commission. (2015).
Special Eurobarometer 431: Data
protection. Brussels, BE.
Retrieved from
http://ec.europa.eu/public_opinion/arc
hives/ebs/ebs_431_en.pdf
Evans, S. K., Pearce, K. E., Vitak, J.,
& Treem, J. W. (2017). Explicating
affordances: A
conceptual framework for
understanding affordances in
communication research. Journal
of Computer-Mediated
Communication, 22(1), 35–52.
https://doi.org/10.1111/jcc4.12180
Fox, J., & McEwan, B. (2017).
Distinguishing technologies for social
interaction: The
perceived social affordances of
communication channels scale.
Communication
Monographs, 84(3), 298–318.
https://doi.org/10.1080/03637751.201
7.1332418
Gibson, J. J. (2014). The ecological
approach to visual perception.
Psychology Press &
Routledge Classic Editions. Hoboken,
NJ: Taylor and Francis (Original work
published
1979).
Grimm, R., & Bräunlich, K. (2015).
Vertrauen und Privatheit [Trust and
privacy]. DuD
Datenschutz und Datensicherheit
[Data protection and data security], 5,
289–294.
Gusy, C. (2018). Datenschutz als
Privatheitsschutz oder Datenschutz
statt Privatheitsschutz?
[Data protection as privacy protection
or privacy protection as data
protection?].
Europäische Grundrechte Zeitschrift
[European Fundamental Rights
Journal], 45(9-12),
244–255.
Helm, P., & Eichenhofer, C. (2019).
Reflexionen zu einem social turn in
den privacy studies.
In C. Aldenhoff, L. Edeler, Hennig,
M., Kelsch, J., L. Raabe, & F. Sobala
(Eds.),
Digitalität und Privatheit [Digitality
and Privacy] (pp. 139–166). Bielefeld,
Germany:
transcript.
https://doi.org/10.14361/97838394466
14-009
Helsper, E. J. (2017). The social
relativity of digital exclusion:
Applying relative deprivation
theory to digital inequalities.
Communication Theory, 27(3), 223–
242.
https://doi.org/10.1111/comt.12110
Howard, P. N., & Parks, M. R. (2012).
Social media and political change:
Capacity,
constraint, and consequence. Journal
of Communication, 62(2), 359–362.
https://doi.org/10.1111/j.1460-
2466.2012.01626.x
Johnson, C. A. (1974). Privacy as
personal control. In S. T. Margulis
(Ed.), Man-environment
interactions: Evaluations and
applications (pp. 83–100).
Stroudsburg, PA: Dowden,
Hutchinson & Ross.
Joinson, A. N. (2001). Self-disclosure
in computer-mediated
communication: The role of self-
awareness and visual anonymity.
European Journal of Social
Psychology, 31(2), 177–192.
https://doi.org/10.1002/ejsp.36
Laufer, R. S. [R. S.], & Wolfe, M.
(1977). Privacy as a concept and a
social issue: A
multidimensional developmental
theory. Journal of Social Issues, 33(3),
22–42.
https://doi.org/10.1111/j.1540-
4560.1977.tb01880.x
Li, Y. (2011). Empirical studies on
online information privacy concerns:
Literature review
and an integrative framework.
Communications of the Association
for Information Systems,
28(1), 453–496. Retrieved from
http://aisel.aisnet.org/cais/vol28/iss1/2
8
Madden, M. (2014). Public
perceptions of privacy and security in
the post-Snowden era.
Retrieved from
http://www.pewinternet.org/2014/11/1
2/public-privacy-perceptions/
Madden, M., & Rainie, L. (2015).
Americans’ attitudes about privacy,
security and
surveillance. Retrieved from
http://www.pewinternet.org/2015/05/2
0/americans-attitudes-
about-privacy-security-and-
surveillance/
Marwick, A. E., & boyd, d. (2014).
Networked privacy. How teenagers
negotiate context in
social media. New Media & Society,
16(7), 1051–1067.
https://doi.org/
10.1177/1461444814543995
Masur, P. K. (2019). Situational
privacy and self-disclosure:
Communication processes in
online environments. Cham,
Switzerland: Springer International
Publishing.
Metzger, M. J. (2004). Privacy, trust,
and disclosure: Exploring barriers to
electronic
commerce. Journal of Computer-
Mediated Communication, 9(4).
https://doi.org/10.1111/j.1083-
6101.2004.tb00292.x
Moor, J. H. (1997). Towards a theory
of privacy in the information age.
ACM SIGCAS
Computers and Society, 27(3), 27–32.
https://doi.org/10.1145/270858.27086
6
Moore, B. (1984). Privacy: Studies in
social and cultural history. Armonk,
NY: M. E. Sharpe.
Nissenbaum, H. (2010). Privacy in
context: Technology, policy, and the
integrity of social
life. Palo Alto, CA: Stanford
University Press.
Ochs, C., & Büttner, B. (2018). Das
Internet als "Sauerstoff" und
"Bedrohung" [The internet
as oxygen and menace]. In M.
Friedewald (Ed.), DuD-Fachbeiträge.
Privatheit und
selbstbestimmtes Leben in der
digitalen Welt [Privacy and a self-
determined life in a
digital world] (pp. 33–80).
Wiesbaden, Germany: Springer
Vieweg.
Papacharissi, Z. (2010). A private
sphere: Democracy in a digital age.
Cambridge: Polity
Press.
Pedersen, D. M. (1999). Model for
types of privacy by privacy functions.
Journal of
Environmental Psychology, 19, 397–
405.
https://doi.org/10.1006/jevp.1999.014
0
Petronio, S. (2002). Boundaries of
privacy. Albany, NY: State University
of New York Press.
Qian, H., & Scott, C. R. (2007).
Anonymity and self-disclosure on
weblogs. Journal of
Computer-Mediated Communication,
12(4), 1428–1451.
https://doi.org/10.1111/j.1083-
6101.2007.00380.x
Quinn, K. (2014). An ecological
approach to privacy: “Doing” online
privacy at midlife.
Journal of Broadcasting & Electronic
Media, 58(4), 562–580.
https://doi.org/
10.1080/08838151.2014.966357
Quinn, K. (2016). Why we share: A
uses and gratifications approach to
privacy regulation in
social media use. Journal of
Broadcasting & Electronic Media,
60(1), 61–86.
https://doi.org/
10.1080/08838151.2015.1127245
Rainie, L., Kiesler, S., Kang, R., &
Madden, M. (2013). Anonymity,
privacy, and security
Online. Retrieved from
http://www.pewinternet.org/2013/09/0
5/anonymity-privacy-and-
security-online/
Ramirez, A., Sumner, E. M., Fleuriet, C., & Cole, M. (2015). When
online dating
partners meet offline: The effect of
modality switching on relational
communication
between online daters. Journal of
Computer-Mediated Communication,
20(1), 99–114.
https://doi.org/10.1111/jcc4.12101
Saeri, A. K., Ogilvie, C., La Macchia,
S. T., Smith, J. R., & Louis, W. R.
(2014). Predicting
Facebook users' online privacy
protection: Risk, trust, norm focus
theory, and the theory of
planned behavior. The Journal of
Social Psychology, 154(4), 352–369.
https://doi.org/
10.1080/00224545.2014.914881
Sarikakis, K., & Winter, L. (2017).
Social media users’ legal
consciousness about privacy.
Social Media + Society, 3(1), 1–14.
https://doi.org/10.1177/205630511769
5325
Schäwel, J. (2020). How to raise
users’ awareness of online privacy.
Duisburg, Germany:
University of Duisburg-Essen.
Schmidt, J.-H. (2014). Twitter and the
rise of personal publics. In K. Weller,
A. Bruns, J.
Burgess, M. Mahrt, & C. Puschmann
(Eds.), Digital formations: Vol. 89.
Twitter and
society (pp. 3–14). New York: Peter
Lang.
Seubert, S., & Becker, C. (2019). The
culture industry revisited:
Sociophilosophical
reflections on ‘privacy’ in the digital
age. Philosophy & Social Criticism,
45(8), 930–947.
https://doi.org/
10.1177/0191453719849719
Sevignani, S. (2016). Privacy and
capitalism in the age of social media.
Routledge research
in information technology and society:
Vol. 18. New York, NY: Routledge.
Slater, M. D. (2007). Reinforcing
spirals: The mutual influence of media
selectivity and
media effects and their impact on
individual behavior and social
identity. Communication
Theory, 17(3), 281–303.
https://doi.org/10.1111/j.1468-
2885.2007.00296.x
Smith, H. J., Dinev, T., & Xu, H.
(2011). Information privacy research:
An interdisciplinary review. MIS Quarterly, 35(4), 989–
1016.
Spottswood, E. L., & Hancock, J. T.
(2017). Should I share that?
Prompting social norms that
influence privacy behaviors on a
social networking site. Journal of
Computer-Mediated
Communication, 22(2), 26.
https://doi.org/10.1111/jcc4.12182
Taneja, A., Vitrano, J., & Gengo, N. J.
(2014). Rationality-based beliefs
affecting individual’s
attitude and intention to use privacy
controls on Facebook: An empirical
investigation.
Computers in Human Behavior, 38,
159–173.
https://doi.org/10.1016/j.chb.2014.05.
027
Tavani, H. T. (2007). Philosophical
theories of privacy: Implications for
an adequate online
privacy policy. Metaphilosophy,
38(1), 1–22.
https://doi.org/10.1111/j.1467-
9973.2006.00474.x
Tavani, H. T., & Moor, J. H. (2001).
Privacy protection, control of
information, and privacy-
enhancing technologies. ACM
SIGCAS Computers and Society,
31(1), 6–11.
https://doi.org/
10.1145/572277.572278
Teutsch, D., Masur, P. K., & Trepte,
S. (2018). Privacy in mediated and
nonmediated
interpersonal communication: How
subjective concepts and situational
perceptions
influence behaviors. Social Media +
Society, 4(2), 1–14.
https://doi.org/
10.1177/2056305118767134
Treem, J. W., & Leonardi, P. M.
(2012). Social media use in
organizations. Exploring the
affordances of visibility, editability,
persistence, and association.
Communication
Yearbook, 36, 143–189.
https://doi.org/10.1080/23808985.201
3.11679130
Trepte, S., & Masur, P. K. (2017).
Need for privacy. In V. Zeigler-Hill & T. K. Shackelford (Eds.), Encyclopedia of personality and
individual differences. London, UK:
Springer.
https://doi.org/10.1007/978-3-319-
28099-8_540-1
Trepte, S., & Reinecke, L. (Eds.).
(2011). Privacy online. Perspectives
on privacy and self-
disclosure in the social web. Berlin,
Germany: Springer.
Trepte, S., Reinecke, L., Ellison, N.
B., Quiring, O., Yao, M. Z., &
Ziegele, M. (2017). A
cross-cultural perspective on the
privacy calculus. Social Media +
Society, 3(1), 1–13.
https://doi.org/
10.1177/2056305116688035
Tsay-Vogel, M., Shanahan, J., &
Signorielli, N. (2018). Social media
cultivating perceptions
of privacy: A 5-year analysis of
privacy attitudes and self-disclosure
behaviors among
Facebook users. New Media &
Society, 20(1), 141–161.
https://doi.org/
10.1177/1461444816660731
Utz, S., & Krämer, N. (2009). The
privacy paradox on social network
sites revisited. The role
of individual characteristics and group
norms. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 3(2). Retrieved from
http://cyberpsychology.eu/view.php?
cisloclanku=2009111001&article=2
Vitak, J. (2012). The impact of
context collapse and privacy on social
network site
disclosures. Journal of Broadcasting &
Electronic Media, 56(4), 451–470.
https://doi.org/
10.1080/08838151.2012.732140
Waldman, A. E. (2018). Privacy as
trust. Cambridge, UK: Cambridge
University Press.
https://doi.org/
10.1017/9781316888667
Walther, J. B. (1996). Computer-
mediated communication. Impersonal,
interpersonal, and
hyperpersonal interaction.
Communication Research, 23(1), 3–
43.
https://doi.org/
10.1177/009365096023001001
Warren, S. D., & Brandeis, L. D.
(1890). The right to privacy. Harvard
Law Review, 4(5),
193–220.
Westin, A. F. (1967). Privacy and
freedom. New York, NY: Atheneum.
Wolfe, M., & Laufer, R. (1974). The
concept of privacy in childhood and
adolescence. In S.
T. Margulis (Ed.), Man-environment
interactions: Evaluations and
applications (pp. 29–
54). Stroudsburg, PA: Dowden,
Hutchinson & Ross.
Abstract
Privacy has been defined as the
selective control of information
sharing, where control is key. For
social media, however, an individual
user’s informational control has
become more difficult. In
this theoretical article, I review how
the term control is part of theorizing on
privacy, and I
develop an understanding of online
privacy with communication as the
core mechanism by which
privacy is regulated. The results of this
article’s theoretical development are
molded into a
definition of privacy and the social
media privacy model. The model is
based on four
propositions: Privacy in social media is
interdependently perceived and valued.
Thus, it cannot
always be achieved through control. As
an alternative, interpersonal
communication is the
primary mechanism by which to ensure
social media privacy. Finally, trust and
norms function as
mechanisms that represent crystallized
privacy communication. Further
materials are available at
https://osf.io/xhqjy/
Keywords: privacy, control, social
media, affordances, communication,
social media
privacy model, definition of privacy
The Social Media Privacy Model:
Privacy and Communication in the
Light of Social Media
Affordances
In historical and current theories about
privacy, control has been perceived as
an
important defining term. The majority
of privacy scholars understand control
as the means by
which to regulate and finally
experience privacy (Altman, 1975;
Burgoon, 1982; Petronio, 2002).
The underlying assumption is that the
more users can control access to their
personal lives, or—
more technically—to their data, the
more privacy they experience. Also, the
most current
understanding held by social media
users is that they need control to
achieve privacy and
informational self-determination
(Marwick & boyd, 2014). A majority of 80% to 90% of U.S.
Americans (Madden & Rainie, 2015)
and Europeans (European Commission,
2015) say that it is
important to them to be in control of
determining who can obtain
information about them and
what information is collected about
them (see also Sarikakis & Winter,
2017).
There is no question that users face
decreasing informational control while
communicating via social media. Due
to their networked nature, social media
applications do not
allow users to control what friends,
acquaintances, institutions, or
companies do with the
information, pictures, and stories that
are shared online (Marwick & boyd,
2014). Further, social
media communication pervades ever larger parts of users’ lives. Newer applications and devices aggregate information and exert
automatic control (Anderson,
Rainie, & Duggan, 2014). As a reaction
to the increasing amounts of data that
are exchanged and
the sociality of such data, 91% of users
perceive that they have lost control
over how their
personal information is collected and
used by friends, acquaintances, and
colleagues (Quinn,
2014) and especially by companies and
governments (Madden, 2014).
These two observations—the
understanding of privacy as control on
the one hand and the
experience of decreasing control over
information while using social media
on the other—can be
termed a control issue of privacy. In the
remainder of this article, I will suggest
an understanding
of privacy that is adequate for social
media use and the requirements
emerging from this issue.
The Relationship of Privacy and
Control
Privacy is a concept that has been
considered and defined in very
different disciplines,
from descriptive, empirical, and
normative perspectives (Sevignani,
2016; Trepte & Reinecke,
2011). In earlier days, privacy was
considered a human right and identified
as the “right to be let
alone” (Warren & Brandeis, 1890, p.
75). Later and more specifically,
privacy was defined as
“the claim of individuals, groups, or
institutions to determine for themselves
when, how, and to
what extent information about them is
communicated to others” (Westin,
1967, p. 7) or “the
selective control of access to the self”
(Altman, 1975, p. 24).
Informational control has only seldom
been defined, but the most common
definitions
touch either a static or behavioral
aspect of control: Informational control
foremost means that
owners of a certain piece of
information have a choice over
whether, when, and to what extent
they will disclose or withhold personal
information (Crowley, 2017; Tavani,
2007). Here control
is static, a question of more or less, yes
or no. It can be understood as an option
or an available
mechanism. Then, control can be
exerted actively (e.g., restricting access
to information,
audience segregation, self-censorship,
encryption), ambiguously (e.g.,
softening the truth,
obfuscating information, or engaging
in other forms of partial disclosure), or
passively (e.g.,
unintentionally omitting information)
(Crowley, 2017; Ochs & Büttner,
2018). In this rather
behavioral understanding,
informational control is executed and
experienced by the individual
person. In both perspectives, control is
centered around the individual and
individual decision
making.
The majority of privacy theories are
devoted to two—somewhat
contradictory—
paradigms: I will call the first
paradigm “privacy as control,” because
here, privacy and control
are strongly connected, and the second
paradigm “privacy and control,”
because here, privacy
and control are treated as separate
constructs with conditional
relationships. I will then suggest a
third perspective that redefines the
meaning and impact of control and the
conditions among
which control becomes relevant. This
perspective will be summarized in the
social media privacy
model.
Paradigm 1: Privacy as Control
In the seminal work by Altman (1975)
and the privacy regulation model of
self-disclosure
(Derlega, Metts, Petronio, & Margulis,
1993), control was set forth as the
crucial mechanism of
privacy. More recent
conceptualizations have also referred to
control as a precondition of privacy
(Petronio, 2002). Even in their very
first conceptualizations of privacy,
Warren and Brandeis
(1890) referred to privacy as the right
to control what others publish about
oneself. In an
overview of privacy theories, Smith,
Dinev, and Xu (2011) investigated 448
publications on
privacy. They found that—besides an
understanding of privacy as a value—
the cognate-based
understanding of privacy as control has
dominated the social sciences.
The vast majority of privacy scholars
have referred to control as a dynamic
behavior in
the process of privacy regulation to
grant access or to deny access. Altman
(1975) suggested a
process model with three steps: First,
an individual assesses the desired level
of privacy; then the individual regulates privacy by controlling interpersonal boundaries; and finally, the individual assesses the achieved level of privacy. In this flow model, the crucial role assigned to control becomes apparent. On the basis of this notion,
Petronio (2002) articulated
how control is the engine of privacy
management. In her understanding, an
individual jointly
manages and coordinates rules with
others while interacting with them.
Here again, control is not
only the behavior through which
privacy can be gained, but control is
also the means by which to
measure the status quo of privacy, and
in turn, it will foster the extent to which
privacy regulation
is further engaged in through an
exertion of control.
Privacy scholars have also referred to
the question of what is being
controlled. Here, in
particular, the control of access to
boundaries and the control of the flow
of an interaction were
addressed as the topics or processes
that needed to be controlled (Johnson,
1974; Wolfe &
Laufer, 1974). Further, control over
stimuli that impinge upon a person
were articulated as things
that need to be controlled (Wolfe &
Laufer, 1974). Margulis (1974)
explained that control refers
to all matters being exchanged between
individuals: “Privacy, as a whole or in
part, represents the
control of transactions between
person(s) and other(s)…” (p. 77).
In some theories, control has been used
almost interchangeably with privacy.
For
example, Derlega et al. (1993) stated
that “…privacy represents control over
the amount and kind
of information exchange that persons
have with one another” (p. 67). Then,
Burgoon (1982)
differentiated between four dimensions
of privacy, all of which refer to how
much control an
individual has: Physical
privacy refers to whether and how
much control an individual
perceives to have over physical
boundaries. Social privacy refers to
how much control an
individual perceives to have over the
access of others to the person’s
environments.
Psychological privacy refers to how
much control an individual perceives to
have over emotional
and cognitive input and output. Finally,
informational privacy refers to how
much control an
individual perceives to have over the
use of personal data. In this
conceptualization, the ability to
exert control is the key to an
individual’s privacy perception and, in
turn, regulation. Many
empirical studies have addressed the
relationship between control and
privacy, but only a
minority of studies have supported the
notion that privacy behavior is related
to informational
control (Brandimarte, Acquisti, &
Loewenstein, 2013).
In sum, studies that have been based on
this first paradigm have underscored
the idea that
individuals exert control to achieve
privacy. In turn, privacy should be
achieved if a certain level
of control is successfully executed and
maintained as the status quo. However,
these
conceptualizations of privacy suggest a
linear relationship between privacy and
control. They
assume that “…the more one has
control over this information exchange,
the greater the amount
of privacy one has in a social
relationship” (Derlega et al., 1993, p.
67). However, previous
empirical research did not find a linear
relationship between privacy and
control. Hence, there is
a mismatch between the theoretical
assumption that privacy and control are
closely related on the
one hand and the rare empirical data
supporting this notion on the other.
Paradigm 2: Privacy and Control
In social media, an individual person
cannot easily execute control because
personal
information is exchanged between
many parties and with a broad range of
applications. Users
experience social media as more
confusing, demanding, and complex
with regard to the control
that they have over their personal
information than face-to-face
communication (Marwick
& boyd, 2014; Quinn, 2014). Woo
(2016) expressed this confusion while
mimicking the
presumed thoughts of a user: “Please
do contact me and give me benefits, but
I still do not want
to fully give up my control (but I do
not know how to have that control)” (p.
954). In other
words, users want to take advantage of
the networked nature of social media,
are painfully aware
of the deficits in control, but have not
yet found solutions for how to embrace
their needs for both
gratification and informational control.
This process of weighing privacy risks
and social
gratifications has also been
investigated under the umbrella of the
privacy calculus (Trepte et al.,
2017).
To follow up on the sentiment that
users wish to have informational
control but that
control seems to contradict the
networked nature of social media,
Moor (1997) and later Tavani
(2007) reflected on the relationship
between privacy and control. They
argued that control and
privacy should be seen as separate
constructs and that privacy and control
serve very different
functions. With the Restricted
Access/Limited Control (RALC) theory
of privacy, these authors
defined privacy in terms of the
individual’s protection from intrusion
and information gathering
by third parties. They argued that
control in the information age is
impossible and further that
“We can have control but no privacy,
and privacy but no control” (Tavani &
Moor, 2001, p. 6).
They suggested that privacy and control
should be separated such that privacy is
a concept and a
value that is defined by being protected
from information access by others,
whereas control is one
mechanism that can be used to manage
and justify privacy. Control may be
exerted through
choice, consent, or correction. In the
flow of the exchange of digital
information, people choose
situations according to their
communication goals, level of access,
and emerging privacy needs
(Trepte & Masur, 2017); then, privacy
is maintained through the processes of
consent, and
finally, corrections allow people to
restore their privacy when it gets lost or
threatened. For the
upcoming social media privacy model,
I will refer to this notion that control is
one mechanism
among others, and I will explain that
for all processes (i.e., choice, consent,
correction),
individuals have to get in touch with
others and communicate their motives
and aims.
With her theory of contextual integrity,
Nissenbaum (2010) also addressed the
contextual
requirements as boundary conditions,
regardless of whether control is a
functional mechanism or
not. She suggested that the two sets of
theories be married: those referring to
privacy as a
constraint on access and those referring
to privacy as a form of control. In her
theory of
contextual integrity, Nissenbaum
(2010) described control as one
“transmission principle” (p.
145) that defines how information is
exchanged. Other transmission
principles are reciprocity and
confidentiality. Control as a
transmission principle is appropriate
only if it fits into the particular
context, the subject that users are
talking about, the type of information
that is to be exchanged,
and the actors they communicate with.
From this point of view, there can be
privacy without
control in situations in which control is
inappropriate or not available (Laufer
& Wolfe, 1977;
Slater, 2007).
Current privacy theories pushed the
idea of control as a dynamic attribute of
the situation
one crucial step further. According to
Dienlin’s (2014) privacy process
model, individuals assess
the controllability of the context and
behavior. Masur (2019) adds an
analysis of what is being
controlled by considering interpersonal
(e.g., the relationship between
interaction partners) and
external factors (e.g., the architecture of
a room) in his theory of situational
privacy and self-
disclosure. Depending on the situation,
these interpersonal and external factors
can be controlled
to different degrees, and in turn, they
can elicit differential levels of self-
disclosure. Self-
disclosure will be understood as “the
intentional communication of
information about the self to
another person or group of people”
(Masur, 2019, p. 70) in the remainder of
this article.
The notion that privacy and control are
not necessarily connected has been
supported by
previous research (Saeri, Ogilvie, La
Macchia, Smith, & Louis, 2014). For
example, Zlatolas,
Welzer, Heričko, and Hölbl (2015)
demonstrated that privacy norms,
policies, and awareness but
not privacy control were related to the
self-disclosures of N = 673 Slovenian
Facebook users. In
a U.S. sample of N = 249 Facebook
users, Taneja, Vitrano, and Gengo
(2014) found that
perceived behavioral control and the
intention to engage in privacy-related
behavior were
unrelated. Eastin et al. (2016)
investigated how different variables
predicted mobile commerce
activity and found that control was the
one that explained the smallest amount
of variance. In
particular, trust and attitude toward
mobile commerce were more important
predictors than
control. In sum, individuals who
perceived that they had control over
their personal data did not
necessarily feel they had more privacy
and did not increasingly engage in self-
disclosure. Further,
trust and norms were identified as
important alternative mechanisms of
privacy (Brandimarte et
al., 2013; Eastin et al., 2016;
Nissenbaum, 2010; Zlatolas et al.,
2015). I will refer to both
findings in the social media privacy
model.
The Interplay of Affordances, Control,
and Privacy
The lack of a relation between privacy
and control might hint that the interplay
of the two
variables is not linear and is actually
more complex (Laufer & Wolfe, 1977).
The relation
between control and privacy should
become clearer if the social media
boundary conditions that
make control a functional mechanism
in one situation but impractical in
another are elucidated.
Social Media and Its Boundary
Conditions for Privacy
Carr and Hayes (2015) defined social
media as “…Internet-based channels
that allow
users to opportunistically interact and
selectively self-present, either in real-
time or
asynchronously, with both broad and
narrow audiences who derive value
from user-generated
content and the perception of
interaction with others” (p. 50). They
further pointed out that users’
interaction will increasingly be
influenced by social media affordances.
Further, social media has
been characterized by its content, its
users, and its infrastructure in previous
definitions (Howard
& Parks, 2012). The most prominent
examples of social media are social
network sites (e.g.,
Facebook, Instagram, LinkedIn,
Google+), multimedia platforms (e.g.,
YouTube, SlideShare,
Soundcloud), weblogs (e.g., personal
diaries of mothers, scholars, or self-
appointed or paid
influencers), and microblogs (e.g.,
Twitter). In struggling to develop a
definition of social media,
scholars have pointed to the fact that
social media channels are formally
understood as methods
of mass communication but that they
primarily contain and perpetuate
personal user interactions
(Carr & Hayes, 2015; Papacharissi,
2010). In this sense, social media can
be referred to as
personal publics (Schmidt, 2014). As a
consequence, users cannot always
clearly define the
somewhat blurred lines between
personal and public or between private
and professional
communication. They feel that contexts
collapse and converge (Papacharissi,
2010; Vitak, 2012).
In sum, social media is characterized
by the following boundary conditions:
the content, its flow
and further uses (Howard & Parks,
2012); the communication practices
that users perceive as
their options for exerting control or for
achieving privacy with other means;
and social media
affordances (Carr & Hayes, 2015). In
the following, I will analyze how the
interplay of these
boundary conditions is related to
control and how it determines different
privacy perceptions and
behaviors. Tables 1 and 2 in the
supplemental materials summarize this
theoretical development.
Social Media Boundary Condition 1:
Content, Its Flow and Uses
What exactly does an individual strive
to control? The sooner we come to
understand
what individuals strive to control, the
better we can evaluate whether control
can be experienced
in social media. According to most
studies, personal information refers to
the content that people
strive to control in order to maintain
their privacy in social media. Metzger
(2004) referred to
personal information as the content to
be controlled. Quinn (2014) suggested
different layers of
how privacy can be maintained. On the
“content layer,” users’ experience of a
lack of control
leads them to limit the information they
post or even to post false information.
Sarikakis and
Winter (2017) added on the basis of
their qualitative work that users do not
differentiate between
personal information and personal data.
Instead, they define the degree of
intimacy or privacy
needed for a certain piece of
information or data.
Then, besides personal information, the
flow and use of the content need to be
considered. Social media advocates
specifically address where online
information is forwarded,
archived, and sold. They emphasize
users’ concerns about how little control
they have over the
flow and use of personal information
(Marwick & boyd, 2014; Quinn, 2014;
Tsay-Vogel,
Shanahan, & Signorielli, 2018). This
refers to the forms personal
information takes, to where it
ends up, and to how it is used. In the
following, personal information, its
flow, and further uses will
be considered as what users strive to
control.
Social Media Boundary Condition 2:
Practices of Control
Actively exerting control expresses
informational self-determination,
which implies
having power over information and
agency in decisions regarding this
information. In turn, a loss
of control would mean that other
behavioral options are out of reach and
that individuality (Buss,
2001), power, and agency are severely
threatened (Brummett & Steuber,
2015). Control also
comes along with risk avoidance:
Users have identified the most
important pieces of information
that they want to control as the contents
of their emails, the contents of their
online chats, and
their location (Cho, Lee, & Chung,
2010). As long as they have control
over this information,
they can avoid being harassed, bullied,
financially exploited by companies, or
surveilled by
governmental institutions.
How is control executed and achieved?
First, informational control can be
identified as an
individual’s perception that he or she
has a choice about whether to withhold
or disclose
information (Crowley, 2017; Johnson,
1974). Choice is the first step and
determines whether
control can be exerted and to what
degree (Wolfe & Laufer, 1974). Then,
in the next step, when
choice is available, it has to be put into
action. Two active practices of control
in social media are
consent and correction (Tavani, 2007).
Consent refers to the extent to which
users agree that a
certain piece of information will be
passed along. Correction means that
users are able to
withdraw from this agreement.
Whereas choice was identified long
ago as a practice of control
(Johnson, 1974), consent and correction
were suggested as social media
practices (Tavani, 2007).
Further, informational control can also
be put into practice by the selective
sharing of
information, self-censorship, audience
segregation, and encryption (Ochs &
Büttner, 2018). All
of these options are directed by the
individual and can be considered to be
ego-centered. It will be
important to also find terms for
interpersonal privacy regulation
behaviors.
Social Media Boundary Condition 3:
Affordances
Social media can be characterized by
affordances. The term affordance,
initially
suggested by Gibson (1979/2014),
means that the environmental
characteristics of a certain entity
are not static but are perceived and experienced differently and, as such, shaped by humans. In the case of social media, this understanding is more than suitable. Of course, the
process of shaping or
“furnishing” (Gibson, 1979/2014, p.
78) social media influences its further
uses. For example,
teenage users regulate their privacy
through social steganography, an
idiomatic language that is
understood only by their peers but not their parents, instead of managing their audiences by
editing their friends lists and
systematically blocking their parents or
certain peers (boyd, 2014).
Inventing and using this kind of
idiomatic language might influence
users’ style of
communication and interactions on
social media. A selection of four
affordances have repeatedly
been shown to be particularly important
for social media: anonymity,
editability, association, and
persistence (boyd, 2008; Evans, Pearce,
Vitak, & Treem, 2017; Treem &
Leonardi, 2012). The
affordances of anonymity and
editability allow users to exert control.
By contrast, the affordances
of association and persistence
challenge users’ ability to exert
control. Both clusters of
affordances—those enhancing (Table
1) as well as those challenging control
(Table 2)—have
different implications for how content
is controlled, which practices of
control are available, and
how they affect privacy regulation in
social media realms. In the following, I
intertwine the
results from previous research on
privacy, the content and practices of control, and affordances.
Anonymity. The affordance of
anonymity describes the idea that other
social media
agents such as other private people,
institutions, or companies do not know
the source of the
message or the sender (Evans et al.,
2017). For social media use, being unknown to others and remaining anonymous is rare
(Rainie, Kiesler, Kang, & Madden,
2013). Nevertheless,
anonymity is still appreciated
occasionally. For example, users
commenting on other users’ posts
in self-help or political forums can
decide to keep their interactions
anonymous. In addition, users
of dating websites might at least partly
or temporarily use such sites
anonymously (Ramirez,
Bryant, Fleuriet, & Cole, 2015).
Anonymity is not a question of “on” or
“off” but is flexible
and scalable (Evans et al., 2017).
Anonymity has spurred enormous
attention in research on
computer-mediated communication
(CMC; Joinson, 2001). Here,
anonymity is specifically
considered a means to control the
access of certain individuals (Qian &
Scott, 2007). Further,
while being online anonymously, an
individual can deliberately decide what
information to share,
with whom to share it, and what to
withhold (Qian & Scott, 2007). Moreover, in anonymous settings, the receiver of a message cannot complement the CMC with face-to-face cues, and as such, control lies in the hands of the sender (Ben-Ze'ev, 2003).
Control over content, its flow, and its
uses is possible because, in a state of
anonymity, all
of these are disconnected from the user.
Although full anonymity is not afforded
by social media,
U.S. American users occasionally keep
their posts anonymous while using
social media with the
clear aim of exerting control (Rainie et
al., 2013). For example, they disguise
their location or
delete cookies so that companies
cannot identify them. An interview
partner in Sarikakis and
Winters’ (2017) study said: “Well
when I use fake names or email
addresses and fake birthdates I
think that’s the only control you can
try to have” (p. 9). Two aspects,
though, might mitigate the
perception of control. First, users know
that they leave traces behind, and once
they have posted
content online—even if it was posted
anonymously—they might be traceable
because of other
information they left online; second,
anonymity is usually used only to some
extent (e.g., by
leaving one’s name but not one’s
address), and users acknowledge that
with partial anonymity,
they experience only partial control,
and in turn, only partial privacy (Rainie
et al., 2013). What practices are available for users to exert control? First, being online
anonymously can be
considered a question of choice. Woo
(2016) suggested that anonymity—or
by contrast,
identifiability—is the most important
issue in online privacy. He argued that
on the social web,
users should have “islands” of
anonymity. He even encouraged
people to lie and to have
secrets with the aim of regaining
control and autonomy. Then, however,
in an anonymous setting,
consent and correction are somewhat disconnected from the user, as these are identity-related
practices.
Anonymity can be an enactment of
control by disguising or lying about
one’s identity or
by leaving it unspecified for some
applications and occasions. Also,
people may leave the source
of their own messages unknown. And,
of course, these enactments of control
might be applied
when interacting with some users but
not with others. In sum, the affordance
of anonymity is
related to informational control, which
has also been demonstrated in
empirical studies (Fox &
McEwan, 2017).
In previous research on privacy,
anonymity has played a crucial role.
Here, it was even
understood as a “type” of privacy
among other types such as solitude,
intimacy, or reserve
(Pedersen, 1999; Westin, 1967). In
sum, anonymity allows users to exert
control and, in turn, it
increases an individual’s subjective
experience of privacy by not being
identifiable at all or by
selectively presenting parts of one’s
own identity (Smith et al., 2011; Woo,
2016).
Editability. While using social media,
users interact remotely in terms of time
and space.
This gives them the chance to edit their
online communications before and
after the
communications are seen by others
(Treem & Leonardi, 2012). Editability
is an affordance that
was previously addressed as important
for CMC in the hyperpersonal model
(Walther, 1996):
Senders of online messages engage in selective self-presentation by primarily transmitting
cues and messages that they want
others to get to know and that put them
in a positive light. In
addition, although editing one’s identity
is part of any social interaction, social
media platforms
offer more freedom in terms of what,
how, and when these interactions are
edited. Editing allows
users to rehearse, change, package, or
literally craft a message and, in turn, to
rehearse, change,
and craft their personal appearance.
Editability affords the message sender control over content, its flow, and its uses because
users have the chance to ponder the
consequences of their posts (Treem &
Leonardi, 2012).
Further, users may control the flow and
further use of their content by
articulating the lists of
online friends that they connect with or
by using a private or public profile
(Ellison & boyd,
2013). The availability of control
practices is highly supported by social
media’s affordance of
editability (Fox & McEwan, 2017).
Users have a choice to either intuitively
post their thoughts or
to pictorially represent their nonverbal
cues. Editing can be considered an
active enactment of
control because users deliberately
choose what to reveal to certain
audiences or what to withhold
(Crowley, 2017). Further, corrections
of one’s own posts and decisions are
possible and can also
be conceived as an enactment of
control (Crowley, 2017).
Control over the flow of information
might also foster subjective
experiences of privacy. In
privacy research, exerting control over the flow of an interaction was often understood as synonymous with privacy itself or as a transmission principle that guarantees privacy (Nissenbaum, 2010).
Association. Social media platforms are
primarily used because they offer users
the chance
to connect with others and to stay in
touch. Users have articulated the idea
that communication is
their most important motive for using
social network sites (Quinn, 2016).
Consequently, the most
important affordance of social media is
the associations that are created or
maintained between
interaction partners (Ellison & boyd,
2013; Treem & Leonardi, 2012).
The affordance of association and the
chance to exert control over content, its
flow, and
uses seem strikingly incompatible. In
their cross-sectional study of N = 875
Mechanical Turk
workers, Fox and McEwan (2017)
demonstrated that informational
control and network
association were negatively related.
Control is exerted by an individual person, and as such, the individual is at the center of, if not entirely responsible for, achieving privacy via control. This
is clearly expressed in users’ current
understanding of control. For example,
participants in
Sarikakis and Winter’s (2017) study
identified the individual as the
legitimate “controller” of
privacy (p. 6). Also, boyd (2014)
argued that exerting control with the
aim of achieving privacy
requires the individual’s full
consideration, power, knowledge, and
skills. The only control
practice available would be to
completely withdraw (i.e., not to
participate) and to accept the
disadvantages that come along with
such a decision. Then, ambiguous and
passive enactments of
control might be used, but these have
the same disadvantages.
In sum, control as an issue and a
concept takes the perspective of the
individual. In other
words, it is an “ego-centered need”
(Papacharissi, 2010, p. 144). By
contrast, association is an
interindividual concept. Very likely,
one person’s control ends at the point
where another
person’s control starts. Hence, as much
as the social web is an interdependent
realm involving
other parties, control alone is not the means by which to ensure privacy. On the basis
of these
considerations—and this will be
important for the upcoming
propositions on social media
privacy—other means and mechanisms
to guarantee the subjective experience
of privacy have
become necessary: Users communicate
with each other to ensure privacy. And
further, if
communication is not possible, they
choose communication partners—
individuals, organizations,
institutions—that they can trust. Trust
has been shown to be crucial for
ensuring the perception of
privacy and subsequent online
disclosure (Metzger, 2004). Trust can
be established by personal
communication (Petronio, 2002) and by
norms that the individual user can rely
on (Saeri et al.,
2014). As such, in the upcoming
propositions and the social media
privacy model, trust,
communication, and norms are
conceptualized as the core mechanisms
to ensure privacy beyond
control.
Persistence. The affordance of
persistence addresses the durability of
online expressions
and content (boyd, 2014) and the idea
that after personal information is
published online, it is
automatically recorded and archived
and is consequently replicable (boyd,
2008). It means that
data remain accessible in the same
form over long periods of time and for
diverse and unforeseen
audiences (Evans et al., 2017; Treem &
Leonardi, 2012). Some authors have
emphasized the
positive outcomes that persistence may
have, namely, that it allows knowledge
to be sustained,
creates robust forms of
communication, establishes the growth
of content (Treem & Leonardi,
2012), and allows for authenticity and
longevity (boyd, 2008).
However, as persistence implies the enduring and unbounded nature of online data, it also
expresses a loss of informational self-
determination. Persistence seems
incompatible with
control. It evokes the idea that control
over content, its flow, and uses is
impossible because once
personal information is posted online,
it is no longer under the sender’s
control. With regard to
control practices, social media users
have the choice to completely
withdraw from online
interactions, to not post their
information, and thus to avoid its
persistence. At the same time, this
would prevent them from receiving the
myriad benefits that come along with
data sharing and
thus does not seem to be a question of
freedom of choice anymore. Also, as
soon as the choice is
made to participate, the availability of other control practices such as consent and correction significantly decreases.
Finally, once given, consent decreases
a person’s chances to subsequently
correct previous online
communication. Users know that they
do not have control over how persistent
their data will be,
and this significantly decreases their
subjective experience of privacy
(Rainie et al., 2013). The
lack of control over personal
information due to the persistence of
all kinds of online information
can be considered one of the key issues
of online lives and the subjective
experience of privacy
today. To react to users’ needs to
foresee and understand persistence
(Rainie et al., 2013),
communication, trust, and norms seem
to be adequate ways to ensure privacy.
The Social Media Privacy Model
Social media privacy is based on
interpersonal processes of mutual
disclosure and
communication (Altman, 1975;
Petronio, 2002). Further, it can be
considered a value that is co-
developed by engaging in
communication and that is expressed
by a shared perception
(Nissenbaum, 2010; Smith et al.,
2011). On the basis of these
considerations, I propose:
Proposition 1: Privacy is
interdependently perceived and valued.
In contrast to privacy, control centers on the individual. It is exerted by the individual person and can be considered ego-centered (Papacharissi, 2010;
Sarikakis & Winter, 2017). Social
media platforms aim for
connectedness, interdependence, and
sociality and can be described as
social-centered (Ellison & boyd, 2013).
As a consequence,
social media privacy cannot be
sufficiently achieved by exerting
control. Other mechanisms are
necessary to ensure social media
privacy.
Proposition 2: Privacy cannot be
satisfactorily achieved by exerting
control in social media.
Instead of striving for control as an
end-game of privacy, the opposite is
necessary to
experience privacy in social media.
Users need to constantly communicate
with each other as
well as with institutions and companies
to ensure their privacy. They need to
engage in both
interpersonal and deliberative
communication processes.
Interpersonal communication is
understood as interactions between
users as well as interactions between
the individual user and
others who represent third parties such
as institutions and companies.
Deliberation is defined as
either informal or institutionalized
interaction among internet users (and
eventually
representatives of governments,
institutions, or companies), involving
rational-critical decision
making and the earnest aim to find a
solution (Burkhalter, Gastil, &
Kelshaw, 2002).
Proposition 3: Interpersonal
communication is a mechanism by
which social media privacy can be interdependently ensured and put into
effect.
However, not all actions and steps of
online media use can be accompanied
by
communication processes. For many if
not most questions of privacy, people
can rely on past
experiences. Here, communication and
deliberation crystallize into a stable
relationship-based
result or even solution, i.e., trust (or mistrust) and norms (or anomie). Trust
can be defined as an
anticipation and expectation of how a
person or institution will behave and as
such reduces
uncertainty (Waldman, 2018, p. 4).
Trust has been shown to ensure privacy
on the basis of
longstanding communication and
reliable bonds (Saeri et al., 2014). Trust
can be conceived as
both a crucial factor of influence in
decisions over self-disclosure and as a
result of
communication (Saeri et al., 2014). In
turn, mistrust—which has not yet been
addressed in
privacy research—is a menace to
privacy and as such should initiate
communication. In a
qualitative study on privacy
perceptions, Teutsch, Masur, and
Trepte (2018) demonstrated that
participants perceived that they had lost
control and had substituted trust for
control. One of the
interview partners said, “Well, privacy
is absolute trust between conversational
partners and
absolute, absolute certainty that the
subject of conversation will stay within
this sphere” (p. 7).
Eichenhofer (2019) suggested that the
“trust paradigm” should point toward a
more current
perspective on privacy regulation via
trust in contrast to privacy regulation
via control or self-
determination.
In the case of privacy regulation, both
social and legislative norms come into
play (Gusy,
2018; Spottswood & Hancock, 2017;
Utz & Krämer, 2009). Social norms are
understood as
social pressure to engage in a certain
kind of behavior and are established by
what others approve
of (injunctive norms) and what they
actually do (descriptive norms) (Saeri
et al., 2014). Although
legislative norms are coined by jurisprudence and regulated by law (and not on the basis of observing others), they share with social norms the feature that they prescribe a certain behavior and allow for sanctions in case this behavior is
not shown. Previous research has
shown that trust and norms are the keys to obtaining privacy
(Marwick & boyd, 2014; Quinn, 2014).
To establish trust
and norms, of course, communication
is necessary.
Proposition 4: Trust and norms
function as privacy mechanisms that
represent
crystallized privacy communication.
I suggest that control and
communication have found a new
balance in social media
communication: Control is losing
control and communication is gaining
power. In other words,
users do not solely rely on and strive
for control in the first place but strive to
communicate about
privacy to establish norms and trust
and even sometimes to regain control.
In fact, privacy’s
interdependence is expressed and put
forward by interpersonal
communication. This emerging
social turn in privacy theory is also
acknowledged in the definition of
privacy.
I define privacy by an individual’s assessments
of (a) the level of access to this
person in an interaction or relationship with
others (people, companies,
institutions) and (b) the availability of the
mechanisms of control,
interpersonal communication, trust, and norms
for shaping this level of access
through (c) self-disclosure as (almost intuitive)
behavioral privacy regulation
and (d) control, interpersonal communication,
and deliberation as means for
ensuring (a somewhat more elaborated)
regulation of privacy. In social media,
then, the availability of the mechanisms that
can be applied to ensure privacy
is crucially influenced by the content that is
being shared and the social media
affordances that determine how this content is
further used.
In the following, I will summarize the
theoretical rationale developed in this
article in the
chronology of a communicative
process. Further, I will show how the
individual’s privacy
assessments referred to in the four
propositions eventually lead to different
forms of privacy
regulation behaviors. The process is
illustrated in Figure 1. The following
steps are meant to
make the model accessible for
empirical investigation.
The first part of the flowchart refers to
the social media user’s subjective and
initial
assessment: All humans have
individual levels of access they
perceive as being more or less
adequate and comfortable. This
dispositional level of access is a
quantifiable dimension varying
between high and low levels and
expressing the individual’s
dispositional willingness to self-
disclose. In contrast, the
communication goal is rooted in the
situation and defines what is to be
achieved in this particular situation.
The individual’s communication goals
in social media are
manifold and important to consider
when assessing online privacy. They
can most likely be
understood as a qualitative scenario of
what the individual user wants and
needs to communicate
in a certain situation. Hence, the point
of departure for each and any consideration about privacy is the set of more or less consciously asked
questions: How do I feel, what do I
need, and what is my
goal for this particular situation?
The second part of the model refers to
the social media boundary conditions
that are
encountered. Here, content and
affordances dynamically interact (as
indicated with the
multiplication sign) with an
individual’s initial assessment. The
individual weighs the ideal level
of access and his/her communication
goals against social media boundary
conditions by
considering what content is shared,
where; how it might flow from one user
or institution to
another; and how it might be used.
Social media content becomes dynamic
as it is displayed and
shared. Affordances represent this
dynamic and, together with the
individual’s dispositions and
goals, shape the available privacy
mechanisms: control, trust, norms, and
interpersonal
communication. Hence, users assess
whether they have a choice, whether
they can rely on trust or
norms, or whether they will (have to)
engage in interpersonal
communication.
The third part of the model refers to the
subjective experience of privacy. The
individual
experiences a certain level of access
that results from the individual’s goals
on the one hand and
the social media boundary conditions
and privacy mechanisms that are
applied to actively
regulate privacy on the other. This
experience is here understood as the rather unfiltered accumulation of external stimuli and internal needs; it then results in a more elaborate reassessment, i.e., the privacy perception, which can be verbalized and is empirically accessible.
The privacy perception results in
different forms of privacy regulation
behaviors. First,
self-disclosure is among the most intuitive regulation behaviors and includes all information intentionally shared (or not shared) with others. And, if the privacy mechanism of control is available and considered adequate for a certain communication goal, users exert control actively and intentionally by restricting access to information, segregating audiences, self-censoring, or encrypting; or, rather ambiguously, by softening the truth or obfuscating information.
When the privacy mechanisms do not
allow for deliberate and somewhat egocentric privacy regulation (i.e., when individuals do not
have at least partial control), other
regulation behaviors
come into play. The individual might
engage in interpersonal communication
or even deliberation
to negotiate and interdependently
regulate privacy. Interpersonal
communication, deliberation,
and control are meta-level regulation
behaviors that come into play when
privacy behaviors are
not intuitive, when contexts collapse, or
when a certain situation demands
further elaboration
and/or communication.
The communication process shown in
Figure 1 will take turns in a constant
flow of
assessments and re-assessments as soon
as the reactions of others lead to
changes in conditions or
when personal goals or needs change.
In what follows, I will discuss the
capabilities and
consequences of the model’s theoretical
propositions. What are the possible
effects and what are
the pitfalls and opportunities that will
occur if communication, trust, and
norms are substituted
for control?
Challenging the Social Media Privacy
Model
Social media use is at the heart of
human communication and offers all of
its many merits
such as practicing freedom of speech or
reaching out to people in critical living
conditions and
providing them with social support. In
addition, social media communication
offers particular
benefits because it is ubiquitous and
independent of time and location. As
such, for example, it
allows people to communicate across
borders into the lands of friends, foes,
and fiends. All of
these merits of online communication
give its users enormous freedom. This
freedom is—and
this is again one of the major merits of
social media communication—often
independent of the
domestic or cultural situation of the
user. Freedom is historically closely
linked to privacy. In
medieval times, only those who had
land were free (Moore, 1984). And
only those who
possessed land had the freedom to
withdraw, grant, or restrict access to
their land. In turn, the
“unfree,” who did not have their own
land or possessions, did not have the
right to privacy. In
this sense, freedom is the origin of privacy both in legislation and in its genealogy
(Gusy, 2018).
For social media and online realms,
freedom and privacy are closely
connected. However,
the ideas of possessions and ownership
do not seem fully applicable anymore.
Borders and
possessions have become fuzzy
because, very often, data and personal
information are perceived
as shared goods as soon as they appear
online. Nissenbaum (2010)
summarized her work on
contextual integrity with the words:
“We have a right to privacy, but it is
neither a right to control
personal information nor a right to
have access to this information
restricted” (p. 231). She
conceded that for social media, privacy
is rather a value and a perception.
Borders and possessions have a
permanent nature. By contrast, values and perceptions are
subject to interpretation and change.
This understanding of privacy as
subject to interpretation
and change makes it a matter of
communication. Eventually,
communication about privacy will
result in trust, and if successfully
shared in a society, in social and
legislative norms. However,
communication cannot be understood
as a long and painful process that will
finally show the
solution and lead us into the light of
privacy. Just the opposite is the case. In
social media,
communication about data sharing and
the use and flow of interpersonal data
is the solution itself.
Only due to ongoing and critical
assessment, reassessment, and dynamic
communication will we
have the chance to ensure privacy as
one of the most important values of
civilized societies.
Privacy’s interdependence is expressed
and put forward by interpersonal
communication.
And in fact, privacy is experiencing a
“social turn”, in social media and
beyond (Helm &
Eichenhofer, 2019). This emerging
social turn in privacy theory is
acknowledged in the social
media privacy model. However, there
are downsides to a conception of
privacy as a
communicative process. First, not all
members of this communicative
process will have equal
chances of being heard. For example,
digital and literacy gaps have been
shown to crucially
influence participation in these
communication processes (Helsper,
2017).
Second, online information has become
a commodified good, and financial
interests are
stark (Sevignani, 2016). Purposeless
communication among friends is
increasingly exploited for
economic reasons (Seubert & Becker,
2019); and, companies do their best to
avoid interactions
between individual users who strive to
regulate their privacy. In turn, they try
to establish trust
through strong branding activities that
have been shown to override social
media users’ privacy
concerns and their interest in solving
privacy issues by actively
communicating and participating
(Boerman, Kruikemeier, & Zuiderveen
Borgesius, 2018; Li, 2011).
Third, as a consequence, the
requirements of communication and
trust demand a lot from
users. Control implies a settled state in
which the individual person can lie
back and stop thinking
about the flow of personal information
online. Communication, trust, and
norms, by contrast, are
subject to change and thus require
constant assessment and consideration.
Hence, information
control should also be considered with
regard to an individual’s ability to
exert control (Grimm &
Bräunlich, 2015). The user-centered perspective needs to be complemented and accompanied by a system-based perspective and
respective interventions (Schäwel,
2020).
Fourth, this demanding process of
communication might also result in a
threat to the
ability to develop a self-determined
identity. Westin (1967) held that
boundary control means
identity control: “This deliberate
penetration of the individual’s
protective shell, his
psychological armor, would leave him
naked to ridicule and shame and would
put him under the
control of those who knew his secrets”
(p. 33). As a consequence, lost control
would mean
threats to identity development.
Finally, I embrace the demanding and
somewhat stressful nature of
communicating about
privacy in social media. In social
media, there is only limited control
over personal information.
In addition, the handling of this lack of
control is demanding and stressful. By
acknowledging
these two observations, users will recognize that they need to take
action, engage in
communication, and establish trust and
shared social and legislative norms. In
social media,
privacy is not a private affair. It is at
the center of communication. We are
out of control because
we have so much to share. Hence,
interpersonal communication, trust,
and norms are the three
most important mechanisms that
interdependently help to ensure social
media privacy.
References
Altman, I. (1975). The environment
and social behavior: Privacy, personal
space, territory,
crowding. Monterey, CA:
Brooks/Cole Publishing Company.
Anderson, J., Rainie, L., & Duggan,
M. (2014). Digital life in 2025.
Retrieved from
http://www.pewinternet.org/
2014/03/11/digital-life-in-2025/
Ben-Ze'ev, A. (2003). Privacy,
emotional closeness, and openness in
cyberspace. Computers
in Human Behavior, 19(4), 451–467.
https://doi.org/10.1016/S0747-
5632(02)00078-X
Boerman, S. C., Kruikemeier, S., &
Zuiderveen Borgesius, F. J. (2018).
Exploring
motivations for online privacy
protection behavior: Insights from
panel data.
Communication Research. Advance online publication.
https://doi.org/10.1177/0093650218800915
boyd, d. (2008). Taken out of context.
American teen sociality in networked
publics (Doctoral
dissertation). University of California,
Berkeley.
boyd, d. (2014). It's complicated. The
social lives of networked teens. New
Haven, CT: Yale
University Press.
Brandimarte, L., Acquisti, A., &
Loewenstein, G. (2013). Misplaced
confidences: Privacy and
the control paradox. Social Psychological and Personality Science,
4(3), 340–347.
https://doi.org/
10.1177/1948550612455931
Brummett, E. A., & Steuber, K. R.
(2015). To reveal or conceal? Privacy
management
processes among interracial romantic
partners. Western Journal of
Communication, 79(1),
22–44.
https://doi.org/10.1080/10570314.201
4.943417
Burgoon, J. K. (1982). Privacy and
communication. Communication
Yearbook, 6(4), 206–
249. https://doi.org/10.1080/23808985
Burkhalter, S., Gastil, J., & Kelshaw,
T. (2002). A conceptual definition and
theoretical model
of public deliberation in small face-to-
face groups. Communication Theory,
12(4), 398–
422.
https://doi.org/10.1093/ct/12.4.398
Buss, A. (2001). Psychological
dimensions of the self. Thousand
Oaks, CA: Sage.
Carr, C. T., & Hayes, R. A. (2015).
Social media: defining, developing,
and divining. Atlantic
Journal of Communication, 23(1), 46–
65.
https://doi.org/10.1080/15456870.201
5.972282
Cho, H., Lee, J.-S., & Chung, S.
(2010). Optimistic bias about online
privacy risks: Testing
the moderating effects of perceived
controllability and prior experience.
Computers in
Human Behavior, 26, 987–995.
https://doi.org/10.1016/j.chb.2010.02.
012
Crowley, J. L. (2017). A framework
of relational information control: A
review and extension
of information control research in
interpersonal contexts.
Communication Theory, 27(2),
202–222.
https://doi.org/10.1111/comt.12115
Derlega, V. J., Metts, S., Petronio, S.,
& Margulis, S. T. (1993). Self-
disclosure. Sage series
on close relationships. Newbury Park,
CA: Sage Publications.
Dienlin, T. (2014). The privacy
process model. In S. Garnett, S. Halft,
M. Herz, & J. M.
Mönig (Eds.), Medien und Privatheit
[Media and privacy] (pp. 105–122).
Passau,
Germany: Karl Stutz.
Eastin, M. S., Brinson, N. H., Doorey,
A., & Wilcox, G. (2016). Living in a
big data world:
predicting mobile commerce activity
through privacy concerns. Computers
in Human
Behavior, 58, 214–220.
https://doi.org/10.1016/j.chb.2015.12.
050
Eichenhofer, J. (2019). e-Privacy -
Theorie und Dogmatik eines
europäischen
Privatheitsschutzes im Internet-
Zeitalter [Theoretical and doctrinal
foundations of a
European privacy protection
regulation in the internet age].
Bielefeld: University of
Bielefeld.
Ellison, N. B., & boyd, d. (2013).
Sociality through social network sites.
In W. H. Dutton
(Ed.), The Oxford handbook of
Internet studies (pp. 151–172).
Oxford, UK: Oxford
University Press.
European Commission. (2015).
Special Eurobarometer 431: Data
protection. Brussels, BE.
Retrieved from
http://ec.europa.eu/public_opinion/arc
hives/ebs/ebs_431_en.pdf
Evans, S. K., Pearce, K. E., Vitak, J.,
& Treem, J. W. (2017). Explicating
affordances: A
conceptual framework for
understanding affordances in
communication research. Journal
of Computer-Mediated
Communication, 22(1), 35–52.
https://doi.org/10.1111/jcc4.12180
Fox, J., & McEwan, B. (2017).
Distinguishing technologies for social
interaction: The
perceived social affordances of
communication channels scale.
Communication
Monographs, 84(3), 298–318.
https://doi.org/10.1080/03637751.201
7.1332418
Gibson, J. J. (2014). The ecological
approach to visual perception.
Psychology Press &
Routledge Classic Editions. Hoboken,
NJ: Taylor and Francis (Original work
published
1979).
Grimm, R., & Bräunlich, K. (2015).
Vertrauen und Privatheit [Trust and
privacy]. DuD
Datenschutz und Datensicherheit
[Data protection and data security], 5,
289–294.
Gusy, C. (2018). Datenschutz als
Privatheitsschutz oder Datenschutz
statt Privatheitsschutz?
[Data protection as privacy protection
or privacy protection as data
protection?].
Europäische Grundrechte Zeitschrift
[European Fundamental Rights
Journal], 45(9-12),
244–255.
Helm, P., & Eichenhofer, C. (2019).
Reflexionen zu einem social turn in
den privacy studies.
In C. Aldenhoff, L. Edeler, M. Hennig, J. Kelsch, L. Raabe, & F. Sobala
(Eds.),
Digitalität und Privatheit [Digitality
and Privacy] (pp. 139–166). Bielefeld,
Germany:
transcript.
https://doi.org/10.14361/97838394466
14-009
Helsper, E. J. (2017). The social
relativity of digital exclusion:
Applying relative deprivation
theory to digital inequalities.
Communication Theory, 27(3), 223–
242.
https://doi.org/10.1111/comt.12110
Howard, P. N., & Parks, M. R. (2012).
Social media and political change:
Capacity,
constraint, and consequence. Journal
of Communication, 62(2), 359–362.
https://doi.org/10.1111/j.1460-
2466.2012.01626.x
Johnson, C. A. (1974). Privacy as
personal control. In S. T. Margulis
(Ed.), Man-environment
interactions: Evaluations and
applications (pp. 83–100).
Stroudsburg, PA: Dowden,
Hutchinson & Ross.
Joinson, A. N. (2001). Self-disclosure
in computer-mediated
communication: The role of self-
awareness and visual anonymity.
European Journal of Social
Psychology, 31(2), 177–192.
https://doi.org/10.1002/ejsp.36
Laufer, R. S., & Wolfe, M.
(1977). Privacy as a concept and a
social issue: A
multidimensional developmental
theory. Journal of Social Issues, 33(3),
22–42.
https://doi.org/10.1111/j.1540-
4560.1977.tb01880.x
Li, Y. (2011). Empirical studies on
online information privacy concerns:
Literature review
and an integrative framework.
Communications of the Association
for Information Systems,
28(1), 453–496. Retrieved from
http://aisel.aisnet.org/cais/vol28/iss1/2
8
Madden, M. (2014). Public
perceptions of privacy and security in
the post-Snowden era.
Retrieved from
http://www.pewinternet.org/2014/11/1
2/public-privacy-perceptions/
Madden, M., & Rainie, L. (2015).
Americans’ attitudes about privacy,
security and
surveillance. Retrieved from
http://www.pewinternet.org/2015/05/2
0/americans-attitudes-
about-privacy-security-and-
surveillance/
Marwick, A. E., & boyd, d. (2014).
Networked privacy. How teenagers
negotiate context in
social media. New Media & Society,
16(7), 1051–1067.
https://doi.org/
10.1177/1461444814543995
Masur, P. K. (2019). Situational
privacy and self-disclosure:
Communication processes in
online environments. Cham,
Switzerland: Springer International
Publishing.
Metzger, M. J. (2004). Privacy, trust,
and disclosure: Exploring barriers to
electronic
commerce. Journal of Computer-
Mediated Communication, 9(4).
https://doi.org/10.1111/j.1083-
6101.2004.tb00292.x
Moor, J. H. (1997). Towards a theory
of privacy in the information age.
ACM SIGCAS
Computers and Society, 27(3), 27–32.
https://doi.org/10.1145/270858.27086
6
Moore, B. (1984). Privacy: Studies in
social and cultural history. Armonk,
N.Y.: M.E.
Sharpe.
Nissenbaum, H. (2010). Privacy in
context: Technology, policy, and the
integrity of social
life. Palo Alto, CA: Stanford
University Press.
Ochs, C., & Büttner, B. (2018). Das
Internet als "Sauerstoff" und
"Bedrohung" [The internet
as oxygen and menace]. In M.
Friedewald (Ed.), DuD-Fachbeiträge.
Privatheit und
selbstbestimmtes Leben in der
digitalen Welt [Privacy and a self-
determined life in a
digital world] (pp. 33–80).
Wiesbaden, Germany: Springer
Vieweg.
Papacharissi, Z. (2010). A private
sphere: Democracy in a digital age.
Cambridge: Polity
Press.
Pedersen, D. M. (1999). Model for
types of privacy by privacy functions.
Journal of
Environmental Psychology, 19, 397–
405.
https://doi.org/10.1006/jevp.1999.014
0
Petronio, S. (2002). Boundaries of
privacy. Albany, NY: State University
of New York Press.
Qian, H., & Scott, C. R. (2007).
Anonymity and self-disclosure on
weblogs. Journal of
Computer-Mediated Communication,
12(4), 1428-1451.
https://doi.org/10.1111/j.1083-
6101.2007.00380.x
Quinn, K. (2014). An ecological
approach to privacy: “Doing” online
privacy at midlife.
Journal of Broadcasting & Electronic
Media, 58(4), 562–580.
https://doi.org/
10.1080/08838151.2014.966357
Quinn, K. (2016). Why we share: A
uses and gratifications approach to
privacy regulation in
social media use. Journal of
Broadcasting & Electronic Media,
60(1), 61–86.
https://doi.org/
10.1080/08838151.2015.1127245
Rainie, L., Kiesler, S., Kang, R., &
Madden, M. (2013). Anonymity,
privacy, and security
online. Retrieved from
http://www.pewinternet.org/2013/09/0
5/anonymity-privacy-and-
security-online/
Ramirez, A., Sumner, E. M., Fleuriet, C., & Cole, M. (2015). When
online dating
partners meet offline: The effect of
modality switching on relational
communication
between online daters. Journal of
Computer-Mediated Communication,
20(1), 99–114.
https://doi.org/10.1111/jcc4.12101
Saeri, A. K., Ogilvie, C., La Macchia,
S. T., Smith, J. R., & Louis, W. R.
(2014). Predicting
Facebook users' online privacy
protection: Risk, trust, norm focus
theory, and the theory of
planned behavior. The Journal of
Social Psychology, 154(4), 352–369.
https://doi.org/
10.1080/00224545.2014.914881
Sarikakis, K., & Winter, L. (2017).
Social media users’ legal
consciousness about privacy.
Social Media + Society, 3(1), 1-14.
https://doi.org/10.1177/205630511769
5325
Schäwel, J. (2020). How to raise
users’ awareness of online privacy.
Duisburg, Germany:
University of Duisburg-Essen.
Schmidt, J.-H. (2014). Twitter and the
rise of personal publics. In K. Weller,
A. Bruns, J.
Burgess, M. Mahrt, & C. Puschmann
(Eds.), Digital formations: Vol. 89.
Twitter and
society (pp. 3–14). New York: Peter
Lang.
Seubert, S., & Becker, C. (2019). The
culture industry revisited:
Sociophilosophical
reflections on ‘privacy’ in the digital
age. Philosophy & Social Criticism,
45(8), 930–947.
https://doi.org/
10.1177/0191453719849719
Sevignani, S. (2016). Privacy and
capitalism in the age of social media.
Routledge research
in information technology and society:
Vol. 18. New York, NY: Routledge.
Slater, M. D. (2007). Reinforcing
spirals: The mutual influence of media
selectivity and
media effects and their impact on
individual behavior and social
identity. Communication
Theory, 17(3), 281–303.
https://doi.org/10.1111/j.1468-
2885.2007.00296.x
Smith, H. J., Dinev, T., & Xu, H.
(2011). Information privacy research:
An interdisciplinary
review. MIS Quarterly, 35(4), 989–
1016.
Spottswood, E. L., & Hancock, J. T.
(2017). Should I share that?
Prompting social norms that
influence privacy behaviors on a
social networking site. Journal of
Computer-Mediated
Communication, 22(2), 26.
https://doi.org/10.1111/jcc4.12182
Taneja, A., Vitrano, J., & Gengo, N. J.
(2014). Rationality-based beliefs
affecting individual’s
attitude and intention to use privacy
controls on Facebook: An empirical
investigation.
Computers in Human Behavior, 38,
159–173.
https://doi.org/10.1016/j.chb.2014.05.
027
Tavani, H. T. (2007). Philosophical
theories of privacy: Implications for
an adequate online
privacy policy. Metaphilosophy,
38(1), 1–22.
https://doi.org/10.1111/j.1467-
9973.2006.00474.x
Tavani, H. T., & Moor, J. H. (2001).
Privacy protection, control of
information, and privacy-
enhancing technologies. ACM
SIGCAS Computers and Society,
31(1), 6–11.
https://doi.org/
10.1145/572277.572278
Teutsch, D., Masur, P. K., & Trepte,
S. (2018). Privacy in mediated and
nonmediated
interpersonal communication: How
subjective concepts and situational
perceptions
influence behaviors. Social Media +
Society, 4(2), 1-14.
https://doi.org/
10.1177/2056305118767134
Treem, J. W., & Leonardi, P. M.
(2012). Social media use in
organizations. Exploring the
affordances of visibility, editability,
persistence, and association.
Communication
Yearbook, 36, 143–189.
https://doi.org/10.1080/23808985.201
3.11679130
Trepte, S., & Masur, P. K. (2017).
Need for privacy. In Zeigler-Hill, V.,
Shakelford, T. K.
(Ed.), Encyclopedia of personality and
individual differences. London, UK:
Springer.
https://doi.org/10.1007/978-3-319-
28099-8_540-1
Trepte, S., & Reinecke, L. (Eds.).
(2011). Privacy online. Perspectives
on privacy and self-
disclosure in the social web. Berlin,
Germany: Springer.
Trepte, S., Reinecke, L., Ellison, N.
B., Quiring, O., Yao, M. Z., &
Ziegele, M. (2017). A
cross-cultural perspective on the
privacy calculus. Social Media +
Society, 3(1), 1-13.
https://doi.org/
10.1177/2056305116688035
Tsay-Vogel, M., Shanahan, J., &
Signorielli, N. (2018). Social media
cultivating perceptions
of privacy: A 5-year analysis of
privacy attitudes and self-disclosure
behaviors among
Facebook users. New Media &
Society, 20(1), 141–161.
https://doi.org/
10.1177/1461444816660731
Utz, S., & Krämer, N. (2009). The
privacy paradox on social network
sites revisited. The role
of individual characteristics and group
norms. Journal of Psychosocial
Research on
Cyberspace, 3(2). Retrieved from
http://cyberpsychology.eu/view.php?
cisloclanku=2009111001&article=2
Vitak, J. (2012). The impact of
context collapse and privacy on social
network site
disclosures. Journal of Broadcasting &
Electronic Media, 56(4), 451–470.
https://doi.org/
10.1080/08838151.2012.732140
Waldman, A. E. (2018). Privacy as
Trust. Cambridge, UK: Cambridge
University Press.
https://doi.org/
10.1017/9781316888667
Walther, J. B. (1996). Computer-
mediated communication. Impersonal,
interpersonal, and
hyperpersonal interaction.
Communication Research, 23(1), 3–
43.
https://doi.org/
10.1177/009365096023001001
Warren, S. D., & Brandeis, L. D.
(1890). The right to privacy. Harvard
Law Review, 4(5),
193–220.
Westin, A. F. (1967). Privacy and
freedom. New York, NY: Atheneum.
Wolfe, M., & Laufer, R. (1974). The
concept of privacy in childhood and
adolescence. In S.
T. Margulis (Ed.), Man-environment
interactions: Evaluations and
applications (pp. 29–
54). Stroudsburg, PA: Dowden,
Hutchinson & Ross.
Abstract
Privacy has been defined as the
selective control of information
sharing, where control is key. For
social media, however, an individual
user’s informational control has
become more difficult. In
this theoretical article, I review how
the term control is part of theorizing on
privacy, and I
develop an understanding of online
privacy with communication as the
core mechanism by which
privacy is regulated. The results of this
article’s theoretical development are
molded into a
definition of privacy and the social
media privacy model. The model is
based on four propositions: (1) Privacy in social media is interdependently perceived and valued; (2) thus, it cannot always be achieved through control; (3) as an alternative, interpersonal communication is the primary mechanism by which to ensure social media privacy; and (4) trust and norms function as mechanisms that represent crystallized privacy communication. Further
materials are available at
https://osf.io/xhqjy/
Keywords: privacy, control, social
media, affordances, communication,
social media
privacy model, definition of privacy
The Social Media Privacy Model:
Privacy and Communication in the
Light of Social Media
Affordances
In historical and current theories about
privacy, control has been perceived as
an
important defining term. The majority
of privacy scholars understand control
as the means by
which to regulate and finally
experience privacy (Altman, 1975;
Burgoon, 1982; Petronio, 2002).
The underlying assumption is that the
more users can control access to their
personal lives, or—
more technically—to their data, the
more privacy they experience. Also, the
most current
understanding held by social media
users is that they need control to
achieve privacy and
informational self-determination
(Marwick & boyd, 2014). Large majorities of 80% to 90% of U.S. Americans (Madden & Rainie, 2015)
and Europeans (European Commission,
2015) say that it is
important to them to be in control of
determining who can obtain
information about them and
what information is collected about
them (see also Sarikakis & Winter,
2017).
There is no question that users face
decreasing informational control while
communicating via social media. Due
to their networked nature, social media
applications do not
allow users to control what friends,
acquaintances, institutions, or
companies do with the
information, pictures, and stories that
are shared online (Marwick & boyd,
2014). Further, social media communication permeates ever larger parts of users' lives. Moreover, the more recent applications and devices aggregate information and exert automatic control (Anderson,
Rainie, & Duggan, 2014). As a reaction
to the increasing amounts of data that
are exchanged and
the sociality of such data, 91% of users
perceive that they have lost control
over how their
personal information is collected and
used by friends, acquaintances, and
colleagues (Quinn,
2014) and especially by companies and
governments (Madden, 2014).
These two observations—the
understanding of privacy as control on
the one hand and the
experience of decreasing control over
information while using social media
on the other—can be
termed a control issue of privacy. In the
remainder of this article, I will suggest
an understanding
of privacy that is adequate for social
media use and the requirements
emerging from this issue.
The Relationship of Privacy and
Control
Privacy is a concept that has been
considered and defined in very
different disciplines,
from descriptive, empirical, and
normative perspectives (Sevignani,
2016; Trepte & Reinecke,
2011). In earlier days, privacy was
considered a human right and identified
as the “right to be let
alone” (Warren & Brandeis, 1890, p.
75). Later and more specifically,
privacy was defined as
“the claim of individuals, groups, or
institutions to determine for themselves
when, how, and to
what extent information about them is
communicated to others” (Westin,
1967, p. 7) or “the
selective control of access to the self”
(Altman, 1975, p. 24).
Informational control has only seldom
been defined, but the most common
definitions
touch on either a static or a behavioral aspect of control: Informational control
foremost means that
owners of a certain piece of
information have a choice over
whether, when, and to what extent
they will disclose or withhold personal
information (Crowley, 2017; Tavani,
2007). Here control
is static, a question of more or less, yes
or no. It can be understood as an option
or an available
mechanism. Then, control can be
exerted actively (e.g., restricting access
to information,
audience segregation, self-censorship,
encryption), ambiguously (e.g.,
softening the truth,
obfuscating information, or engaging
in other forms of partial disclosure), or
passively (e.g., unintentionally omitting information)
(Crowley, 2017; Ochs & Büttner,
2018). In this rather
behavioral understanding,
informational control is executed and
experienced by the individual
person. In both perspectives, control is
centered around the individual and
individual decision
making.
The majority of privacy theories are
devoted to two—somewhat
contradictory—
paradigms: I will call the first
paradigm “privacy as control,” because
here, privacy and control
are strongly connected, and, the second
paradigm “privacy and control,”
because here, privacy
and control are treated as separate
constructs with conditional
relationships. I will then suggest a
third perspective that redefines the
meaning and impact of control and the
conditions among
which control becomes relevant. This
perspective will be summarized in the
social media privacy
model.
Paradigm 1: Privacy as Control
In the seminal work by Altman (1975)
and the privacy regulation model of
self-disclosure
(Derlega, Metts, Petronio, & Margulis,
1993), control was set forth as the
crucial mechanism of
privacy. More recent
conceptualizations have also referred to
control as a precondition of privacy
(Petronio, 2002). Even in their very
first conceptualizations of privacy,
Warren and Brandeis
(1890) referred to privacy as the right
to control what others publish about
oneself. In an
overview of privacy theories, Smith,
Dinev, and Xu (2011) investigated 448
publications on
privacy. They found that—besides an
understanding of privacy as a value—
the cognate-based
understanding of privacy as control has
dominated the social sciences.
The vast majority of privacy scholars
have referred to control as a dynamic
behavior in
the process of privacy regulation to
grant access or to deny access. Altman
(1975) suggested a
process model with three steps: First,
an individual assesses the desired level
of privacy; then the
individual eventually regulates privacy
by controlling interpersonal
boundaries; and then, the
individual again assesses the achieved
level of privacy. In this flow model, the crucial role assigned to control becomes apparent. On the basis of this notion,
Petronio (2002) articulated
how control is the engine of privacy
management. In her understanding, an
individual jointly
manages and coordinates rules with
others while interacting with them.
Here again, control is not
only the behavior through which
privacy can be gained, but control is
also the means by which to
measure the status quo of privacy, and
in turn, it will foster the extent to which
privacy regulation
is further engaged in through an
exertion of control.
Privacy scholars have also referred to
the question of what is being
controlled. Here, in
particular, the control of access to
boundaries and the control of the flow
of an interaction were
addressed as the topics or processes
that needed to be controlled (Johnson,
1974; Wolfe &
Laufer, 1974). Further, control over the stimuli that impinge upon a person was also articulated as necessary (Wolfe & Laufer, 1974). Margulis (1974)
explained that control refers
to all matters being exchanged between
individuals: “Privacy, as a whole or in
part, represents the
control of transactions between
person(s) and other(s)…” (p. 77).
In some theories, control has been used
almost interchangeably with privacy.
For
example, Derlega et al. (1993) stated
that “…privacy represents control over
the amount and kind
of information exchange that persons
have with one another” (p. 67). Then,
Burgoon (1982)
differentiated between four dimensions
of privacy, all of which refer to how
much control an
individual has: Physical privacy refers to the control an individual perceives to have over physical boundaries. Social privacy refers to the perceived control over others' access to the person's environments. Psychological privacy refers to the perceived control over emotional and cognitive input and output. Finally, informational privacy refers to the perceived control over the use of personal data.
conceptualization, the ability to
exert control is the key to an
individual’s privacy perception and, in
turn, regulation. Many
empirical studies have addressed the
relationship between control and
privacy, but only a
minority of studies have supported the
notion that privacy behavior is related
to informational
control (Brandimarte, Acquisti, &
Loewenstein, 2013).
In sum, studies that have been based on
this first paradigm have underscored
the idea that
individuals exert control to achieve
privacy. In turn, privacy should be
achieved if a certain level
of control is successfully executed and
maintained as the status quo. However,
these
conceptualizations of privacy suggest a
linear relationship between privacy and
control. They
assume that “…the more one has
control over this information exchange,
the greater the amount
of privacy one has in a social
relationship” (Derlega et al., 1993, p.
67). Yet previous empirical research did not find a linear
relationship between privacy and
control. Hence, there is
a mismatch between the theoretical
assumption that privacy and control are
closely related on the
one hand and the rare empirical data
supporting this notion on the other.
Paradigm 2: Privacy and Control
In social media, an individual person
cannot easily execute control because
personal
information is exchanged between
many parties and with a broad range of
applications. Users
experience social media as more
confusing, demanding, and complex
with regard to the control
that they have over their personal
information than face-to-face
communication (Marwick
& boyd, 2014; Quinn, 2014). Woo
(2016) expressed this confusion while
mimicking the
presumed thoughts of a user: “Please
do contact me and give me benefits, but
I still do not want
to fully give up my control (but I do
not know how to have that control)” (p.
954). In other
words, users want to take advantage of
the networked nature of social media,
are painfully aware
of the deficits in control, but have not
yet found solutions for how to embrace
their needs for both
gratification and informational control.
This process of weighing privacy risks
and social
gratifications has also been
investigated under the umbrella of the
privacy calculus (Trepte et al.,
2017).
To follow up on the sentiment that
users wish to have informational
control but that
control seems to contradict the
networked nature of social media,
Moor (1997) and later Tavani
(2007) reflected on the relationship
between privacy and control. They
argued that control and
privacy should be seen as separate
constructs and that privacy and control
serve very different
functions. With the Restricted
Access/Limited Control (RALC) theory
of privacy, these authors
defined privacy in terms of the
individual’s protection from intrusion
and information gathering
by third parties. They argued that
control in the information age is
impossible and further that
“We can have control but no privacy,
and privacy but no control” (Tavani &
Moor, 2001, p. 6).
They suggested that privacy and control
should be separated such that privacy is
a concept and a
value that is defined by being protected
from information access by others,
whereas control is one
mechanism that can be used to manage
and justify privacy. Control may be
exerted through
choice, consent, or correction. In the
flow of the exchange of digital
information, people choose
situations according to their
communication goals, level of access,
and emerging privacy needs
(Trepte & Masur, 2017); then, privacy
is maintained through the processes of
consent, and
finally, corrections allow people to
restore their privacy when it gets lost or
threatened. For the
upcoming social media privacy model,
I will refer to this notion that control is
one mechanism
among others, and I will explain that
for all processes (i.e., choice, consent,
correction),
individuals have to get in touch with
others and communicate their motives
and aims.
With her theory of contextual integrity,
Nissenbaum (2010) also addressed the
contextual
requirements as boundary conditions,
regardless of whether control is a
functional mechanism or
not. She suggested that the two sets of
theories be married: those referring to
privacy as a
constraint on access and those referring
to privacy as a form of control. In her
theory of
contextual integrity, Nissenbaum
(2010) described control as one
“transmission principle” (p.
145) that defines how information is
exchanged. Other transmission
principles are reciprocity and
confidentiality. Control as a
transmission principle is appropriate
only if it fits into the particular
context, the subject that users are
talking about, the type of information
that is to be exchanged,
and the actors they communicate with.
From this point of view, there can be
privacy without
control in situations in which control is
inappropriate or not available (Laufer
& Wolfe, 1977;
Slater, 2007).
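Nissenbaum's core idea, that an information flow is acceptable only when its transmission principle matches the norms of the context, can be loosely sketched as a lookup against context-relative norms. The norm table and all names below are invented for illustration and are not part of her theory:

```python
# Loose sketch of a contextual-integrity check (after Nissenbaum, 2010):
# an information flow is appropriate only if its transmission principle
# matches the norm governing that context and information type.
# The norm table below is an invented example, not from the theory.

CONTEXT_NORMS = {
    ("healthcare", "diagnosis"): "confidentiality",
    ("friendship", "location"): "reciprocity",
    ("commerce", "payment data"): "consent",
}

def flow_is_appropriate(context, info_type, transmission_principle):
    """Return True if the flow respects the norm for this context and
    information type; unknown pairs default to inappropriate."""
    return CONTEXT_NORMS.get((context, info_type)) == transmission_principle
```

In this sketch, sharing a diagnosis under a confidentiality principle fits the healthcare context, whereas insisting on unilateral control there does not; control is appropriate only where the context's norms make it so, which mirrors the point that there can be privacy without control.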
Current privacy theories pushed the
idea of control as a dynamic attribute of
the situation
one crucial step further. According to
Dienlin’s (2014) privacy process
model, individuals assess
the controllability of the context and
behavior. Masur (2019) adds an
analysis of what is being
controlled by entangling interpersonal
(e.g., the relationship between
interaction partners) and
external factors (e.g., the architecture of
a room) in his theory of situational
privacy and self-
disclosure. Depending on the situation,
these interpersonal and external factors
can be controlled
to different degrees, and in turn, they
can elicit differential levels of self-
disclosure. Self-
disclosure will be understood as “the
intentional communication of
information about the self to
another person or group of people”
(Masur, 2019, p. 70) in the remainder of
this article.
The notion that privacy and control are
not necessarily connected has been
supported by
previous research (Saeri, Ogilvie, La
Macchia, Smith, & Louis, 2014). For
example, Zlatolas,
Welzer, Heričko, and Hölbl (2015)
demonstrated that privacy norms,
policies, and awareness but
not privacy control were related to the
self-disclosures of N = 673 Slovenian
Facebook users. In
a U.S. sample of N = 249 Facebook
users, Taneja, Vitrano, and Gengo
(2014) found that
perceived behavioral control and the
intention to engage in privacy-related
behavior were
unrelated. Eastin et al. (2016)
investigated how different variables
predicted mobile commerce
activity and found that control was the
one that explained the smallest amount
of variance. In
particular, trust and attitude toward
mobile commerce were more important
predictors than
control. In sum, individuals who
perceived that they had control over
their personal data did not
necessarily feel they had more privacy
and did not increasingly engage in self-
disclosure. Further,
trust and norms were identified as
important alternative mechanisms of
privacy (Brandimarte et
al., 2013; Eastin et al., 2016;
Nissenbaum, 2010; Zlatolas et al.,
2015). I will refer to both
findings in the social media privacy
model.
The Interplay of Affordances, Control,
and Privacy
The lack of a relation between privacy
and control might hint that the interplay
of the two
variables is not linear and is actually
more complex (Laufer & Wolfe, 1977).
The relation
between control and privacy should
become clearer if the social media
boundary conditions that
make control a functional mechanism
in one situation but impractical in
another are elucidated.
Social Media and Its Boundary Conditions for Privacy
Carr and Hayes (2015) defined social
media as “…Internet-based channels
that allow
users to opportunistically interact and
selectively self-present, either in real-
time or
asynchronously, with both broad and
narrow audiences who derive value
from user-generated
content and the perception of
interaction with others” (p. 50). They
further pointed out that users’
interaction will increasingly be
influenced by social media affordances.
Further, social media has
been characterized by its content, its
users, and its infrastructure in previous
definitions (Howard
& Parks, 2012). The most prominent
examples of social media are social
network sites (e.g.,
Facebook, Instagram, LinkedIn,
Google+), multimedia platforms (e.g.,
YouTube, SlideShare, SoundCloud), weblogs (e.g., personal
diaries of mothers, scholars, or self-
appointed or paid
influencers), and microblogs (e.g.,
Twitter). In struggling to develop a
definition of social media,
scholars have pointed to the fact that
social media channels are formally
understood as methods
of mass communication but that they
primarily contain and perpetuate
personal user interactions
(Carr & Hayes, 2015; Papacharissi,
2010). In this sense, social media can
be referred to as
personal publics (Schmidt, 2014). As a
consequence, users cannot always
clearly define the
somewhat blurred lines between
personal and public or between private
and professional
communication. They feel that contexts
collapse and converge (Papacharissi,
2010; Vitak, 2012).
In sum, social media is characterized
by the following boundary conditions:
the content, its flow
and further uses (Howard & Parks,
2012); the communication practices
that users perceive as
their options for exerting control or for
achieving privacy with other means;
and social media
affordances (Carr & Hayes, 2015). In the following, I will analyze how the interplay of these boundary conditions is related to control and how it determines different privacy perceptions and behaviors. Tables 1 and 2 in the supplemental materials summarize this theoretical development.
Social Media Boundary Condition 1:
Content, its Flow and Uses
What exactly does an individual strive
to control? The better we understand what individuals strive to control, the better we can evaluate whether control can be experienced
in social media. According to most
studies, personal information refers to
the content that people
strive to control in order to maintain
their privacy in social media. Metzger
(2004) referred to
personal information as the content to
be controlled. Quinn (2014) suggested
different layers of
how privacy can be maintained. On the
“content layer,” users’ experience of a
lack of control
leads them to limit the information they
post or even to post false information.
Sarikakis and
Winter (2017) added on the basis of
their qualitative work that users do not
differentiate between
personal information and personal data.
Instead, they define the degree of
intimacy or privacy
needed for a certain piece of
information or data.
Then, besides personal information, the
flow and use of the content needs to be
considered. Social media advocates
specifically address where online
information is forwarded,
archived, and sold. They emphasize
users’ concerns about how little control
they have over the
flow and use of personal information
(Marwick & boyd, 2014; Quinn, 2014;
Tsay-Vogel,
Shanahan, & Signorielli, 2018). This
refers to the forms personal information takes, where it ends up, and how it is used. In the
following, personal information, its
flow, and further uses will
be considered as what users strive to
control.
Social Media Boundary Condition 2:
Practices of Control
Actively exerting control expresses
informational self-determination,
which implies
having power over information and
agency in decisions regarding this
information. In turn, a loss
of control would mean that other
behavioral options are out of reach and
that individuality (Buss,
2001), power, and agency are severely
threatened (Brummett & Steuber,
2015). Control also
comes along with risk avoidance:
Users have identified the most
important pieces of information
that they want to control as the contents
of their emails, the contents of their
online chats, and
their location (Cho, Lee, & Chung,
2010). As long as they have control
over this information,
they can avoid being harassed, bullied,
financially exploited by companies, or
surveilled by
governmental institutions.
How is control executed and achieved?
First, informational control can be
identified as an
individual’s perception that he or she
has a choice about whether to withhold
or disclose
information (Crowley, 2017; Johnson,
1974). Choice is the first step and
determines whether
control can be exerted and to what
degree (Wolfe & Laufer, 1974). Then,
in the next step, when
choice is available, it has to be put into
action. Two active practices of control
in social media are
consent and correction (Tavani, 2007).
Consent refers to the extent to which
users agree that a
certain piece of information will be
passed along. Correction means that
users are able to
withdraw from this agreement.
Whereas choice was identified long
ago as a practice of control
(Johnson, 1974), consent and correction
were suggested as social media
practices (Tavani, 2007).
Further, informational control can also
be put into practice by the selective
sharing of
information, self-censorship, audience
segregation, and encryption (Ochs &
Büttner, 2018). All
of these options are directed by the
individual and can be considered to be
ego-centered. It will be
important to also find terms for
interpersonal privacy regulation
behaviors.
Social Media Boundary Condition 3:
Affordances
Social media can be characterized by
affordances. The term affordance,
initially
suggested by Gibson (1979/2014),
means that the environmental
characteristics of a certain entity
are not static but are differently perceived and experienced, and as such are shaped by humans. In the case of
social media, this understanding is
more than suitable. Of course, the
process of shaping or
“furnishing” (Gibson, 1979/2014, p.
78) social media influences its further
uses. For example,
teenage users regulate their privacy through social steganography, an idiomatic language that is understood only by their peers but not their parents, instead of managing their audiences by editing their friends lists and systematically blocking their parents or certain peers (boyd, 2014).
Inventing and using this kind of
idiomatic language might influence
users’ style of
communication and interactions on
social media. A selection of four affordances has repeatedly been shown to be particularly important
for social media: anonymity,
editability, association, and
persistence (boyd, 2008; Evans, Pearce,
Vitak, & Treem, 2017; Treem &
Leonardi, 2012). The
affordances of anonymity and
editability allow users to exert control.
By contrast, the affordances
of association and persistence
challenge users’ ability to exert
control. Both clusters of
affordances—those enhancing (Table
1) as well as those challenging control
(Table 2)—have
different implications for how content
is controlled, which practices of
control are available, and
how they affect privacy regulation in
social media realms. In the following, I
intertwine the
results from previous research on
privacy, the content and practices of
control, and affordances.
Anonymity. The affordance of
anonymity describes the idea that other
social media
agents such as other private people,
institutions, or companies do not know
the source of the
message or the sender (Evans et al.,
2017). For social media use, being
unknown to others and
anonymously using social media is rare
(Rainie, Kiesler, Kang, & Madden,
2013). Nevertheless,
anonymity is still appreciated
occasionally. For example, users
commenting on other users’ posts
in self-help or political forums can
decide to keep their interactions
anonymous. In addition, users
of dating websites might at least partly
or temporarily use such sites
anonymously (Ramirez,
Bryant, Fleuriet, & Cole, 2015).
Anonymity is not a question of “on” or
“off” but is flexible
and scalable (Evans et al., 2017).
Anonymity has spurred enormous
attention in research on
computer-mediated communication
(CMC; Joinson, 2001). Here,
anonymity is specifically
considered a means to control the
access of certain individuals (Qian &
Scott, 2007). Further,
while being online anonymously, an individual can deliberately decide what information to share, with whom to share it, and what to withhold (Qian & Scott, 2007). Moreover, in anonymous CMC, the receiver of a message cannot complement the interaction with face-to-face cues, and as such, control lies in the hands of the sender (Ben-Ze'ev, 2003).
Control over content, its flow, and its
uses is possible because, in a state of
anonymity, all
of these are disconnected from the user.
Although full anonymity is not afforded
by social media,
U.S. American users occasionally keep
their posts anonymous while using
social media with the
clear aim of exerting control (Rainie et
al., 2013). For example, they disguise
their location or
delete cookies so that companies
cannot identify them. An interview
partner in Sarikakis and
Winters’ (2017) study said: “Well
when I use fake names or email
addresses and fake birthdates I
think that’s the only control you can
try to have” (p. 9). Two aspects,
though, might mitigate the
perception of control. First, users know
that they leave traces behind, and once
they have posted
content online—even if it was posted
anonymously—they might be traceable
because of other
information they left online; second,
anonymity is usually used only to some
extent (e.g., by
leaving one’s name but not one’s
address), and users acknowledge that
with partial anonymity,
they experience only partial control,
and in turn, only partial privacy (Rainie
et al., 2013). What
practices are available to users to exert control? First, being online
anonymously can be
considered a question of choice. Woo
(2016) suggested that anonymity—or
by contrast,
identifiability—is the most important
issue in online privacy. He argued that
on the social web,
users should have “islands” of
anonymity. He even encouraged
people to lie and to have
secrets with the aim of regaining
control and autonomy. Then, however,
in an anonymous setting,
consent and corrections are somewhat
disconnected from the user as these are
identity-related
practices.
Anonymity can be an enactment of
control by disguising or lying about
one’s identity or
by leaving it unspecified for some
applications and occasions. Also,
people may leave the source
of their own messages unknown. And,
of course, these enactments of control
might be applied
when interacting with some users but
not with others. In sum, the affordance
of anonymity is
related to informational control, which
has also been demonstrated in
empirical studies (Fox &
McEwan, 2017).
In previous research on privacy,
anonymity has played a crucial role.
Here, it was even
understood as a “type” of privacy
among other types such as solitude,
intimacy, or reserve
(Pedersen, 1999; Westin, 1967). In
sum, anonymity allows users to exert
control and, in turn, it
increases an individual’s subjective
experience of privacy by not being
identifiable at all or by
selectively presenting parts of one’s
own identity (Smith et al., 2011; Woo,
2016).
Editability. While using social media,
users interact remotely in terms of time
and space.
This gives them the chance to edit their
online communications before and
after the
communications are seen by others
(Treem & Leonardi, 2012). Editability
is an affordance that
was previously addressed as important
for CMC in the hyperpersonal model
(Walther, 1996):
Senders of online messages self-
selectively present themselves online
by primarily transmitting
cues and messages that they want others to see and that put them in a positive light. In
addition, although editing one’s identity
is part of any social interaction, social
media platforms
offer more freedom in terms of what,
how, and when these interactions are
edited. Editing allows
users to rehearse, change, package, or
literally craft a message and, in turn, to
rehearse, change,
and craft their personal appearance.
Editability allows the message sender
control over content and its flow and
uses because
users have the chance to ponder the
consequences of their posts (Treem &
Leonardi, 2012).
Further, users may control the flow and
further use of their content by
articulating the lists of
online friends that they connect with or
by using a private or public profile
(Ellison & boyd,
2013). The availability of control
practices is highly supported by social
media’s affordance of
editability (Fox & McEwan, 2017).
Users have a choice to either intuitively
post their thoughts or
to pictorially represent their nonverbal
cues. Editing can be considered an
active enactment of
control because users deliberately
choose what to reveal to certain
audiences or what to withhold
(Crowley, 2017). Further, corrections
of one’s own posts and decisions are
possible and can also
be conceived as an enactment of
control (Crowley, 2017).
Control over the flow of information
might also foster subjective
experiences of privacy. In
privacy research, exerting control over
the flow of an interaction was often
equated with control in general or understood as a transmission principle that guaranteed privacy (Nissenbaum, 2010).
Association. Social media platforms are
primarily used because they offer users
the chance
to connect with others and to stay in
touch. Users have articulated the idea
that communication is
their most important motive for using
social network sites (Quinn, 2016).
Consequently, the most
important affordance of social media is
the associations that are created or
maintained between
interaction partners (Ellison & boyd,
2013; Treem & Leonardi, 2012).
The affordance of association and the
chance to exert control over content, its
flow, and uses seem strikingly incompatible. In
their cross-sectional study of N = 875
Mechanical Turk
workers, Fox and McEwan (2017)
demonstrated that informational
control and network
association were negatively related.
Control is exerted by an individual
person, and as such, the
individual person is at the center of, if not entirely responsible for, achieving privacy via control. This
is clearly expressed in users’ current
understanding of control. For example,
participants in
Sarikakis and Winter’s (2017) study
identified the individual as the
legitimate “controller” of
privacy (p. 6). Also, boyd (2014)
argued that exerting control with the
aim of achieving privacy
requires the individual’s full
consideration, power, knowledge, and
skills. The only control
practice available would be to
completely withdraw (i.e., not to
participate) and to accept the
disadvantages that come along with
such a decision. Then, ambiguous and
passive enactments of
control might be used, but these have
the same disadvantages.
In sum, control as an issue and a
concept takes the perspective of the
individual. In other
words, it is an “ego-centered need”
(Papacharissi, 2010, p. 144). By
contrast, association is an
interindividual concept. Very likely,
one person’s control ends at the point
where another
person’s control starts. Hence, as much
as the social web is an interdependent
realm involving
other parties, control is not the means
to ensure or exert privacy. On the basis
of these
considerations—and this will be
important for the upcoming
propositions on social media
privacy—other means and mechanisms
to guarantee the subjective experience
of privacy have
become necessary: Users communicate
with each other to ensure privacy. And
further, if
communication is not possible, they
choose communication partners—
individuals, organizations,
institutions—that they can trust. Trust
has been shown to be crucial for
ensuring the perception of
privacy and subsequent online
disclosure (Metzger, 2004). Trust can
be established by personal
communication (Petronio, 2002) and by
norms that the individual user can rely
on (Saeri et al.,
2014). As such, in the upcoming
propositions and the social media
privacy model, trust,
communication, and norms are
conceptualized as the core mechanisms
to ensure privacy beyond
control.
Persistence. The affordance of
persistence addresses the durability of
online expressions
and content (boyd, 2014) and the idea
that after personal information is
published online, it is
automatically recorded and archived
and is consequently replicable (boyd,
2008). It means that
data remain accessible in the same
form over long periods of time and for
diverse and unforeseen
audiences (Evans et al., 2017; Treem &
Leonardi, 2012). Some authors have
emphasized the
positive outcomes that persistence may
have, namely, that it allows knowledge
to be sustained,
creates robust forms of
communication, establishes the growth
of content (Treem & Leonardi,
2012), and allows for authenticity and
longevity (boyd, 2008).
However, as persistence comprises the
endless and infinite nature of online
data, it also
expresses a loss of informational self-
determination. Persistence seems
incompatible with
control. It evokes the idea that control
over content, its flow, and uses is
impossible because once
personal information is posted online,
it is no longer under the sender’s
control. With regard to
control practices, social media users
have the choice to completely
withdraw from online
interactions, to not post their
information, and thus to avoid its
persistence. At the same time, this
would prevent them from receiving the
myriad benefits that come along with
data sharing and
thus does not seem to be a question of
freedom of choice anymore. Also, as
soon as the choice is
made to participate, other control
practices such as consent and correction
significantly decrease.
Finally, once given, consent decreases
a person’s chances to subsequently
correct previous online
communication. Users know that they
do not have control over how persistent
their data will be,
and this significantly decreases their
subjective experience of privacy
(Rainie et al., 2013). The
lack of control over personal
information due to the persistence of
all kinds of online information
can be considered one of the key issues
of online lives and the subjective
experience of privacy
today. To react to users’ needs to
foresee and understand persistence
(Rainie et al., 2013),
communication, trust, and norms seem
to be adequate ways to ensure privacy.
The Social Media Privacy Model
Social media privacy is based on
interpersonal processes of mutual
disclosure and
communication (Altman, 1975;
Petronio, 2002). Further, it can be
considered a value that is co-
developed by engaging in
communication and that is expressed
by a shared perception
(Nissenbaum, 2010; Smith et al.,
2011). On the basis of these
considerations, I propose:
Proposition 1: Privacy is
interdependently perceived and valued.
In contrast to privacy, control is at the
center of the individual person. Control
is exerted
by the individual person, and it can be
considered to be ego-centered
(Papacharissi, 2010;
Sarikakis & Winter, 2017). Social
media platforms aim for
connectedness, interdependence, and
sociality and can be described as
social-centered (Ellison & boyd, 2013).
As a consequence,
social media privacy cannot be
sufficiently achieved by exerting
control. Other mechanisms are
necessary to ensure social media
privacy.
Proposition 2: Privacy cannot be
satisfactorily achieved by exerting
control in social media.
Instead of striving for control as an
end-game of privacy, the opposite is
necessary to
experience privacy in social media.
Users need to constantly communicate
with each other as
well as with institutions and companies
to ensure their privacy. They need to
engage in both
interpersonal and deliberative
communication processes.
Interpersonal communication is
understood as interactions between
users as well as interactions between
the individual user and
others who represent third parties such
as institutions and companies.
Deliberation is defined as
either informal or institutionalized
interaction among internet users (and
eventually
representatives of governments,
institutions, or companies), involving
rational-critical decision
making and the earnest aim to find a
solution (Burkhalter, Gastil, &
Kelshaw, 2002).
Proposition 3: Interpersonal
communication is a mechanism by
which social media privacy can be interdependently ensured and put into effect.
However, not all actions and steps of
online media use can be accompanied
by
communication processes. For many if
not most questions of privacy, people
can rely on past
experiences. Here, communication and
deliberation crystallize into a stable
relationship-based
result or even solution, i.e., trust (or
mistrust) and norms (or anomia). Trust
can be defined as an
anticipation and expectation of how a
person or institution will behave and as
such reduces
uncertainty (Waldman, 2018, p. 4).
Trust has been shown to ensure privacy
on the basis of
longstanding communication and
reliable bonds (Saeri et al., 2014). Trust
can be conceived as
both a crucial factor of influence in
decisions over self-disclosure and as a
result of
communication (Saeri et al., 2014). In
turn, mistrust—which has not yet been
addressed in
privacy research—is a menace to
privacy and as such should initiate
communication. In a
qualitative study on privacy
perceptions, Teutsch, Masur, and
Trepte (2018) demonstrated that
participants perceived that they had lost
control and had substituted trust for
control. One of the
interview partners said, “Well, privacy
is absolute trust between conversational
partners and
absolute, absolute certainty that the
subject of conversation will stay within
this sphere” (p. 7).
Eichenhofer (2019) suggested that the
“trust paradigm” should point toward a
more current
perspective on privacy regulation via
trust in contrast to privacy regulation
via control or self-
determination.
In the case of privacy regulation, both
social and legislative norms come into
play (Gusy,
2018; Spottswood & Hancock, 2017;
Utz & Krämer, 2009). Social norms are
understood as
social pressure to engage in a certain
kind of behavior and are established by
what others approve
of (injunctive norms) and what they
actually do (descriptive norms) (Saeri
et al., 2014). Although
legislative norms are coined by
jurisprudence, regulated by law (and
not on the basis of observing
others), they resemble social norms in that they prescribe a certain behavior and allow for sanctions in case this behavior is not shown. Previous research has shown that trust and norms are the keys to obtaining privacy
(Marwick & boyd, 2014; Quinn, 2014).
To establish trust
and norms, of course, communication
is necessary.
Proposition 4: Trust and norms
function as privacy mechanisms that
represent
crystallized privacy communication.
I suggest that control and
communication have found a new
balance in social media
communication: Control is losing
control and communication is gaining
power. In other words,
users do not solely rely on and strive
for control in the first place but strive to
communicate about
privacy to establish norms and trust
and even sometimes to regain control.
In fact, privacy’s
interdependence is expressed and put
forward by interpersonal
communication. This emerging
social turn in privacy theory is also
acknowledged in the definition of
privacy.
I define privacy by an individual’s assessments
of (a) the level of access to this
person in an interaction or relationship with
others (people, companies,
institutions) and (b) the availability of the
mechanisms of control,
interpersonal communication, trust, and norms
for shaping this level of access
through (c) self-disclosure as (almost intuitive)
behavioral privacy regulation
and (d) control, interpersonal communication,
and deliberation as means for
ensuring (a somewhat more elaborated)
regulation of privacy. In social media,
then, the availability of the mechanisms that
can be applied to ensure privacy
is crucially influenced by the content that is
being shared and the social media
affordances that determine how this content is
further used.
In the following, I will summarize the
theoretical rationale developed in this
article in the
chronology of a communicative
process. Further, I will show how the
individual’s privacy
assessments referred to in the four
propositions eventually lead to different
forms of privacy
regulation behaviors. The process is
illustrated in Figure 1. The following
steps are meant to
make the model accessible for
empirical investigation.
The first part of the flowchart refers to
the social media user’s subjective and
initial
assessment: All humans have
individual levels of access they
perceive as being more or less
adequate and comfortable. This
dispositional level of access is a
quantifiable dimension varying
between high and low levels and
expressing the individual’s
dispositional willingness to self-
disclose. In contrast, the
communication goal is rooted in the
situation and defines what is to be
achieved in this particular situation.
The individual’s communication goals
in social media are
manifold and important to consider
when assessing online privacy. They
can most likely be
understood as a qualitative scenario of
what the individual user wants and
needs to communicate
in a certain situation. Hence, the point
of departure for each and any
consideration about privacy
lies in the more or less consciously asked
questions: How do I feel, what do I
need, and what is my
goal for this particular situation?
The second part of the model refers to
the social media boundary conditions
that are
encountered. Here, content and
affordances dynamically interact (as
indicated with the
multiplication sign) with an
individual’s initial assessment. The
individual weighs the ideal level
of access and his/her communication
goals against social media boundary
conditions by
considering what content is shared,
where; how it might flow from one user
or institution to
another; and how it might be used.
Social media content becomes dynamic
as it is displayed and
shared. Affordances represent this
dynamic and, together with the
individual’s dispositions and
goals, shape the available privacy
mechanisms: control, trust, norms, and
interpersonal
communication. Hence, users assess
whether they have a choice, whether
they can rely on trust or
norms, or whether they will (have to)
engage in interpersonal
communication.
The third part of the model refers to the
subjective experience of privacy. The
individual
experiences a certain level of access
that results from the individual’s goals
on the one hand and
the social media boundary conditions
and privacy mechanisms that are
applied to actively
regulate privacy on the other. This
experience is here understood as the
rather unfiltered
accumulation of external stimuli and internal needs, which then results in a more elaborated re-assessment, i.e., the privacy perception, which can be verbalized and is empirically accessible.
The privacy perception results in
different forms of privacy regulation
behaviors. First,
self-disclosure is among the most intuitive regulation behaviors and includes all information intentionally shared (or not shared) with others. If the privacy mechanism of control is available and considered adequate for a certain communication goal, users exert control actively and intentionally by restricting access to information, segregating audiences, censoring themselves, or encrypting content; or, more ambiguously, by softening the truth or obfuscating information.
When the privacy mechanisms do not allow for deliberate and somewhat ego-centered privacy regulation (i.e., when individuals do not have at least partial control), other regulation behaviors
come into play. The individual might
engage in interpersonal communication
or even deliberation
to negotiate and interdependently
regulate privacy. Interpersonal
communication, deliberation,
and control are meta-level regulation
behaviors that come into play when
privacy behaviors are
not intuitive, when contexts collapse, or
when a certain situation demands
further elaboration
and/or communication.
The communication process shown in
Figure 1 will take turns in a constant
flow of
assessments and re-assessments as soon
as the reactions of others lead to
changes in conditions or
when personal goals or needs change.
In what follows, I will discuss the
capabilities and
consequences of the model’s theoretical
propositions. What are the possible
effects and what are
the pitfalls and opportunities that will
occur if communication, trust, and
norms are substituted
for control?
Challenging the Social Media Privacy
Model
Social media use is at the heart of
human communication and offers all of
its many merits
such as practicing freedom of speech or
reaching out to people in critical living
conditions and
providing them with social support. In
addition, social media communication
offers particular
benefits because it is ubiquitous and
independent of time and location. As
such, for example, it
allows people to communicate across
borders into the lands of friends, foes,
and fiends. All of
these merits of online communication
give its users enormous freedom. This
freedom is—and
this is again one of the major merits of
social media communication—often
independent of the
domestic or cultural situation of the
user. Freedom is historically closely
linked to privacy. In
medieval times, only those who had
land were free (Moore, 1984). And
only those who
possessed land had the freedom to
withdraw, grant, or restrict access to
their land. In turn, the
“unfree,” who did not have their own
land or possessions, did not have the
right to privacy. In
this sense, freedom is the origin of privacy, both in legislation and in its genealogy (Gusy, 2018).
For social media and online realms,
freedom and privacy are closely
connected. However,
the ideas of possessions and ownership
do not seem fully applicable anymore.
Borders and
possessions have become fuzzy
because, very often, data and personal
information are perceived
as shared goods as soon as they appear
online. Nissenbaum (2010)
summarized her work on
contextual integrity with the words:
“We have a right to privacy, but it is
neither a right to control
personal information nor a right to
have access to this information
restricted” (p. 231). She
conceded that for social media, privacy
is rather a value and a perception.
Borders and possessions have a
permanent nature. By contrast, values
and perceptions are
subject to interpretation and change.
This understanding of privacy as
subject to interpretation
and change makes it a matter of
communication. Eventually,
communication about privacy will
result in trust, and if successfully
shared in a society, in social and
legislative norms. However,
communication cannot be understood
as a long and painful process that will
finally show the
solution and lead us into the light of
privacy. Just the opposite is the case. In
social media,
communication about data sharing and
the use and flow of interpersonal data
is the solution itself.
Only due to ongoing and critical
assessment, reassessment, and dynamic
communication will we
have the chance to ensure privacy as
one of the most important values of
civilized societies.
Privacy’s interdependence is expressed
and put forward by interpersonal
communication.
And in fact, privacy is experiencing a
“social turn”, in social media and
beyond (Helm &
Eichenhofer, 2019). This emerging
social turn in privacy theory is
acknowledged in the social
media privacy model. However, there
are downsides to a conception of
privacy as a
communicative process. First, not all
members of this communicative
process will have equal
chances of being heard. For example,
digital and literacy gaps have been
shown to crucially
influence participation in these
communication processes (Helsper,
2017).
Second, online information has become
a commodified good, and financial
interests are
stark (Sevignani, 2016). Purposeless
communication among friends is
increasingly exploited for
economic reasons (Seubert & Becker,
2019); and, companies do their best to
avoid interactions
between individual users who strive to
regulate their privacy. In turn, they try
to establish trust
through strong branding activities that
have been shown to override social
media users’ privacy
concerns and their interest in solving
privacy issues by actively
communicating and participating
(Boerman, Kruikemeier, & Zuiderveen
Borgesius, 2018; Li, 2011).
Third, as a consequence, the
requirements of communication and
trust demand a lot from
users. Control implies a settled state in
which the individual person can lie
back and stop thinking
about the flow of personal information
online. Communication, trust, and
norms, by contrast, are
subject to change and thus require
constant assessment and consideration.
Hence, information
control should also be considered with
regard to an individual’s ability to
exert control (Grimm &
Bräunlich, 2015). The user-centered perspective needs to be complemented and accompanied by a system-based perspective and respective interventions (Schäwel, 2020).
Fourth, this demanding process of
communication might also result in a
threat to the
ability to develop a self-determined
identity. Westin (1967) held that
boundary control means
identity control: “This deliberate
penetration of the individual’s
protective shell, his
psychological armor, would leave him
naked to ridicule and shame and would
put him under the
control of those who knew his secrets”
(p. 33). As a consequence, lost control
would mean
threats to identity development.
Finally, I embrace the demanding and
somewhat stressful nature of
communicating about
privacy in social media. In social
media, there is only limited control
over personal information.
In addition, the handling of this lack of
control is demanding and stressful. By
acknowledging
these two observations, users will
acknowledge that they need to take
action, engage in
communication, and establish trust and
shared social and legislative norms. In
social media,
privacy is not a private affair. It is at
the center of communication. We are
out of control because
we have so much to share. Hence,
interpersonal communication, trust,
and norms are the three
most important mechanisms that
interdependently help to ensure social
media privacy.
References
Altman, I. (1975). The environment
and social behavior: Privacy, personal
space, territory,
crowding. Monterey, CA:
Brooks/Cole Publishing Company.
Anderson, J., Rainie, L., & Duggan,
M. (2014). Digital life in 2025.
Retrieved from
http://www.pewinternet.org/2014/03/11/digital-life-in-2025/
Ben-Ze'ev, A. (2003). Privacy,
emotional closeness, and openness in
cyberspace. Computers
in Human Behavior, 19(4), 451–467. https://doi.org/10.1016/S0747-5632(02)00078-X
Boerman, S. C., Kruikemeier, S., &
Zuiderveen Borgesius, F. J. (2018).
Exploring
motivations for online privacy
protection behavior: Insights from
panel data.
Communication Research, 25.
https://doi.org/10.1177/009365021880
0915
boyd, d. (2008). Taken out of context.
American teen sociality in networked
publics (Doctoral
dissertation). University of California,
Berkeley.
boyd, d. (2014). It's complicated. The
social lives of networked teens. New
Haven, CT: Yale
University Press.
Brandimarte, L., Acquisti, A., &
Loewenstein, G. (2013). Misplaced
confidences: Privacy and
the control paradox. Social Psychological and Personality Science, 4(3), 340–347.
https://doi.org/
10.1177/1948550612455931
Brummett, E. A., & Steuber, K. R.
(2015). To reveal or conceal? Privacy
management
processes among interracial romantic
partners. Western Journal of
Communication, 79(1),
22–44.
https://doi.org/10.1080/10570314.201
4.943417
Burgoon, J. K. (1982). Privacy and
communication. Communication
Yearbook, 6(4), 206–
249. https://doi.org/10.1080/23808985
Burkhalter, S., Gastil, J., & Kelshaw,
T. (2002). A conceptual definition and
theoretical model
of public deliberation in small face-to-
face groups. Communication Theory,
12(4), 398–
422.
https://doi.org/10.1093/ct/12.4.398
Buss, A. (2001). Psychological
dimensions of the self. Thousand
Oaks, CA: Sage.
Carr, C. T., & Hayes, R. A. (2015).
Social media: Defining, developing,
and divining. Atlantic
Journal of Communication, 23(1), 46–
65.
https://doi.org/10.1080/15456870.201
5.972282
Cho, H., Lee, J.-S., & Chung, S.
(2010). Optimistic bias about online
privacy risks: Testing
the moderating effects of perceived
controllability and prior experience.
Computers in
Human Behavior, 26, 987–995.
https://doi.org/10.1016/j.chb.2010.02.
012
Crowley, J. L. (2017). A framework
of relational information control: A
review and extension
of information control research in
interpersonal contexts.
Communication Theory, 27(2),
202–222.
https://doi.org/10.1111/comt.12115
Derlega, V. J., Metts, S., Petronio, S.,
& Margulis, S. T. (1993). Self-
disclosure. Sage series
on close relationships. Newbury Park,
CA: Sage Publications.
Dienlin, T. (2014). The privacy
process model. In S. Garnett, S. Halft,
M. Herz, & J. M.
Mönig (Eds.), Medien und Privatheit
[Media and privacy] (pp. 105–122).
Passau,
Germany: Karl Stutz.
Eastin, M. S., Brinson, N. H., Doorey,
A., & Wilcox, G. (2016). Living in a
big data world:
predicting mobile commerce activity
through privacy concerns. Computers
in Human
Behavior, 58, 214–220.
https://doi.org/10.1016/j.chb.2015.12.
050
Eichenhofer, J. (2019). e-Privacy -
Theorie und Dogmatik eines
europäischen
Privatheitsschutzes im Internet-
Zeitalter [Theoretical and doctrinal
foundations of a
European privacy protection
regulation in the internet age].
Bielefeld: University of
Bielefeld.
Ellison, N. B., & boyd, d. (2013).
Sociality through social network sites.
In W. H. Dutton
(Ed.), The Oxford handbook of
Internet studies (pp. 151–172).
Oxford, UK: Oxford
University Press.
European Commission. (2015).
Special Eurobarometer 431: Data
protection. Brussels, BE.
Retrieved from
http://ec.europa.eu/public_opinion/arc
hives/ebs/ebs_431_en.pdf
Evans, S. K., Pearce, K. E., Vitak, J.,
& Treem, J. W. (2017). Explicating
affordances: A
conceptual framework for
understanding affordances in
communication research. Journal
of Computer-Mediated
Communication, 22(1), 35–52.
https://doi.org/10.1111/jcc4.12180
Fox, J., & McEwan, B. (2017).
Distinguishing technologies for social
interaction: The
perceived social affordances of
communication channels scale.
Communication
Monographs, 84(3), 298–318.
https://doi.org/10.1080/03637751.201
7.1332418
Gibson, J. J. (2014). The ecological
approach to visual perception.
Psychology Press &
Routledge Classic Editions. Hoboken,
NJ: Taylor and Francis (Original work
published
1979).
Grimm, R., & Bräunlich, K. (2015).
Vertrauen und Privatheit [Trust and
privacy]. DuD
Datenschutz und Datensicherheit
[Data protection and data security], 5,
289–294.
Gusy, C. (2018). Datenschutz als
Privatheitsschutz oder Datenschutz
statt Privatheitsschutz?
[Data protection as privacy protection
or privacy protection as data
protection?].
Europäische Grundrechte Zeitschrift
[European Fundamental Rights
Journal], 45(9-12),
244–255.
Helm, P., & Eichenhofer, C. (2019).
Reflexionen zu einem social turn in
den privacy studies.
In C. Aldenhoff, L. Edeler, M. Hennig, J. Kelsch, L. Raabe, & F. Sobala
(Eds.),
Digitalität und Privatheit [Digitality
and Privacy] (pp. 139–166). Bielefeld,
Germany:
transcript.
https://doi.org/10.14361/97838394466
14-009
Helsper, E. J. (2017). The social
relativity of digital exclusion:
Applying relative deprivation
theory to digital inequalities.
Communication Theory, 27(3), 223–
242.
https://doi.org/10.1111/comt.12110
Howard, P. N., & Parks, M. R. (2012).
Social media and political change:
Capacity,
constraint, and consequence. Journal
of Communication, 62(2), 359–362.
https://doi.org/10.1111/j.1460-
2466.2012.01626.x
Johnson, C. A. (1974). Privacy as
personal control. In S. T. Margulis
(Ed.), Man-environment
interactions: Evaluations and
applications (pp. 83–100).
Stroudsburg, PA: Dowden,
Hutchinson & Ross.
Joinson, A. N. (2001). Self-disclosure
in computer-mediated
communication: The role of self-
awareness and visual anonymity.
European Journal of Social
Psychology, 31(2), 177–192.
https://doi.org/10.1002/ejsp.36
Laufer, R. S., & Wolfe, M.
(1977). Privacy as a concept and a
social issue: A
multidimensional developmental
theory. Journal of Social Issues, 33(3),
22–42.
https://doi.org/10.1111/j.1540-
4560.1977.tb01880.x
Li, Y. (2011). Empirical studies on
online information privacy concerns:
Literature review
and an integrative framework.
Communications of the Association
for Information Systems,
28(1), 453–496. Retrieved from
http://aisel.aisnet.org/cais/vol28/iss1/2
8
Madden, M. (2014). Public
perceptions of privacy and security in
the post-Snowden era.
Retrieved from
http://www.pewinternet.org/2014/11/1
2/public-privacy-perceptions/
Madden, M., & Rainie, L. (2015).
Americans’ attitudes about privacy,
security and
surveillance. Retrieved from
http://www.pewinternet.org/2015/05/2
0/americans-attitudes-
about-privacy-security-and-
surveillance/
Marwick, A. E., & boyd, d. (2014).
Networked privacy. How teenagers
negotiate context in
social media. New Media & Society,
16(7), 1051–1067.
https://doi.org/
10.1177/1461444814543995
Masur, P. K. (2019). Situational
privacy and self-disclosure:
Communication processes in
online environments. Cham,
Switzerland: Springer International
Publishing.
Metzger, M. J. (2004). Privacy, trust,
and disclosure: Exploring barriers to
electronic
commerce. Journal of Computer-
Mediated Communication, 9(4).
https://doi.org/10.1111/j.1083-
6101.2004.tb00292.x
Moor, J. H. (1997). Towards a theory
of privacy in the information age.
ACM SIGCAS
Computers and Society, 27(3), 27–32.
https://doi.org/10.1145/270858.27086
6
Moore, B. (1984). Privacy: Studies in
social and cultural history. Armonk, NY: M. E. Sharpe.
Nissenbaum, H. (2010). Privacy in
context: Technology, policy, and the
integrity of social
life. Palo Alto, CA: Stanford
University Press.
Ochs, C., & Büttner, B. (2018). Das
Internet als "Sauerstoff" und
"Bedrohung" [The internet
as oxygen and menace]. In M.
Friedewald (Ed.), DuD-Fachbeiträge.
Privatheit und
selbstbestimmtes Leben in der
digitalen Welt [Privacy and a self-
determined life in a
digital world] (pp. 33–80).
Wiesbaden, Germany: Springer
Vieweg.
Papacharissi, Z. (2010). A private
sphere: Democracy in a digital age.
Cambridge: Polity
Press.
Pedersen, D. M. (1999). Model for
types of privacy by privacy functions.
Journal of
Environmental Psychology, 19, 397–
405.
https://doi.org/10.1006/jevp.1999.014
0
Petronio, S. (2002). Boundaries of
privacy. Albany, NY: State University
of New York Press.
Qian, H., & Scott, C. R. (2007).
Anonymity and self-disclosure on
weblogs. Journal of
Computer-Mediated Communication,
12(4), 1428–1451.
https://doi.org/10.1111/j.1083-
6101.2007.00380.x
Quinn, K. (2014). An ecological
approach to privacy: “Doing” online
privacy at midlife.
Journal of Broadcasting & Electronic
Media, 58(4), 562–580.
https://doi.org/
10.1080/08838151.2014.966357
Quinn, K. (2016). Why we share: A
uses and gratifications approach to
privacy regulation in
social media use. Journal of
Broadcasting & Electronic Media,
60(1), 61–86.
https://doi.org/
10.1080/08838151.2015.1127245
Rainie, L., Kiesler, S., Kang, R., &
Madden, M. (2013). Anonymity,
privacy, and security
Online. Retrieved from
http://www.pewinternet.org/2013/09/0
5/anonymity-privacy-and-
security-online/
Ramirez, A., Sumner, E. M., Fleuriet, C., & Cole, M. (2015). When
online dating
partners meet offline: The effect of
modality switching on relational
communication
between online daters. Journal of
Computer-Mediated Communication,
20(1), 99–114.
https://doi.org/10.1111/jcc4.12101
Saeri, A. K., Ogilvie, C., La Macchia,
S. T., Smith, J. R., & Louis, W. R.
(2014). Predicting
Facebook users' online privacy
protection: Risk, trust, norm focus
theory, and the theory of
planned behavior. The Journal of
Social Psychology, 154(4), 352–369.
https://doi.org/
10.1080/00224545.2014.914881
Sarikakis, K., & Winter, L. (2017).
Social media users’ legal
consciousness about privacy.
Social Media + Society, 3(1), 1–14.
https://doi.org/10.1177/205630511769
5325
Schäwel, J. (2020). How to raise
users’ awareness of online privacy.
Duisburg, Germany:
University of Duisburg-Essen.
Schmidt, J.-H. (2014). Twitter and the
rise of personal publics. In K. Weller,
A. Bruns, J.
Burgess, M. Mahrt, & C. Puschmann
(Eds.), Digital formations: Vol. 89.
Twitter and
society (pp. 3–14). New York: Peter
Lang.
Seubert, S., & Becker, C. (2019). The
culture industry revisited:
Sociophilosophical
reflections on ‘privacy’ in the digital
age. Philosophy & Social Criticism,
45(8), 930–947.
https://doi.org/
10.1177/0191453719849719
Sevignani, S. (2016). Privacy and
capitalism in the age of social media.
Routledge research
in information technology and society:
Vol. 18. New York, NY: Routledge.
Slater, M. D. (2007). Reinforcing
spirals: The mutual influence of media
selectivity and
media effects and their impact on
individual behavior and social
identity. Communication
Theory, 17(3), 281–303.
https://doi.org/10.1111/j.1468-
2885.2007.00296.x
Smith, H. J., Dinev, T., & Xu, H.
(2011). Information privacy research: An interdisciplinary review. MIS Quarterly, 35(4), 989–1016.
Spottswood, E. L., & Hancock, J. T.
(2017). Should I share that?
Prompting social norms that
influence privacy behaviors on a
social networking site. Journal of
Computer-Mediated
Communication, 22(2), 26.
https://doi.org/10.1111/jcc4.12182
Taneja, A., Vitrano, J., & Gengo, N. J.
(2014). Rationality-based beliefs
affecting individual’s
attitude and intention to use privacy
controls on Facebook: An empirical
investigation.
Computers in Human Behavior, 38,
159–173.
https://doi.org/10.1016/j.chb.2014.05.
027
Tavani, H. T. (2007). Philosophical
theories of privacy: Implications for
an adequate online
privacy policy. Metaphilosophy,
38(1), 1–22.
https://doi.org/10.1111/j.1467-
9973.2006.00474.x
Tavani, H. T., & Moor, J. H. (2001).
Privacy protection, control of
information, and privacy-
enhancing technologies. ACM
SIGCAS Computers and Society,
31(1), 6–11.
https://doi.org/
10.1145/572277.572278
Teutsch, D., Masur, P. K., & Trepte,
S. (2018). Privacy in mediated and
nonmediated
interpersonal communication: How
subjective concepts and situational
perceptions
influence behaviors. Social Media +
Society, 4(2), 1–14.
https://doi.org/
10.1177/2056305118767134
Treem, J. W., & Leonardi, P. M.
(2012). Social media use in
organizations. Exploring the
affordances of visibility, editability,
persistence, and association.
Communication
Yearbook, 36, 143–189.
https://doi.org/10.1080/23808985.201
3.11679130
Trepte, S., & Masur, P. K. (2017).
Need for privacy. In V. Zeigler-Hill & T. K. Shackelford (Eds.), Encyclopedia of personality and
individual differences. London, UK:
Springer.
https://doi.org/10.1007/978-3-319-
28099-8_540-1
Trepte, S., & Reinecke, L. (Eds.).
(2011). Privacy online. Perspectives
on privacy and self-
disclosure in the social web. Berlin,
Germany: Springer.
Trepte, S., Reinecke, L., Ellison, N.
B., Quiring, O., Yao, M. Z., &
Ziegele, M. (2017). A
cross-cultural perspective on the
privacy calculus. Social Media +
Society, 3(1), 1–13.
https://doi.org/
10.1177/2056305116688035
Tsay-Vogel, M., Shanahan, J., &
Signorielli, N. (2018). Social media
cultivating perceptions
of privacy: A 5-year analysis of
privacy attitudes and self-disclosure
behaviors among
Facebook users. New Media &
Society, 20(1), 141–161.
https://doi.org/
10.1177/1461444816660731
Utz, S., & Krämer, N. (2009). The
privacy paradox on social network
sites revisited. The role
of individual characteristics and group
norms. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 3(2). Retrieved from
http://cyberpsychology.eu/view.php?
cisloclanku=2009111001&article=2
Vitak, J. (2012). The impact of
context collapse and privacy on social
network site
disclosures. Journal of Broadcasting &
Electronic Media, 56(4), 451–470.
https://doi.org/
10.1080/08838151.2012.732140
Waldman, A. E. (2018). Privacy as
Trust. Cambridge, UK: Cambridge
University Press.
https://doi.org/
10.1017/9781316888667
Walther, J. B. (1996). Computer-
mediated communication. Impersonal,
interpersonal, and
hyperpersonal interaction.
Communication Research, 23(1), 3–
43.
https://doi.org/
10.1177/009365096023001001
Warren, S. D., & Brandeis, L. D.
(1890). The right to privacy. Harvard
Law Review, 4(5),
193–220.
Westin, A. F. (1967). Privacy and
freedom. New York, NY: Atheneum.
Wolfe, M., & Laufer, R. (1974). The
concept of privacy in childhood and
adolescence. In S.
T. Margulis (Ed.), Man-environment
interactions: Evaluations and
applications (pp. 29–
54). Stroudsburg, PA: Dowden,
Hutchinson & Ross.
Abstract
Privacy has been defined as the
selective control of information
sharing, where control is key. For
social media, however, an individual
user’s informational control has
become more difficult. In
this theoretical article, I review how
the term control is part of theorizing on
privacy, and I
develop an understanding of online
privacy with communication as the
core mechanism by which
privacy is regulated. The results of this
article’s theoretical development are
molded into a
definition of privacy and the social
media privacy model. The model is
based on four
propositions: Privacy in social media is
interdependently perceived and valued.
Thus, it cannot
always be achieved through control. As
an alternative, interpersonal
communication is the
primary mechanism by which to ensure
social media privacy. Finally, trust and
norms function as
mechanisms that represent crystallized
privacy communication. Further
materials are available at
https://osf.io/xhqjy/
Keywords: privacy, control, social
media, affordances, communication,
social media
privacy model, definition of privacy
The Social Media Privacy Model:
Privacy and Communication in the
Light of Social Media
Affordances
In historical and current theories about
privacy, control has been perceived as
an
important defining term. The majority
of privacy scholars understand control
as the means by
which to regulate and finally
experience privacy (Altman, 1975;
Burgoon, 1982; Petronio, 2002).
The underlying assumption is that the
more users can control access to their
personal lives, or—
more technically—to their data, the
more privacy they experience. Also, the
most current
understanding held by social media
users is that they need control to
achieve privacy and
informational self-determination
(Marwick & boyd, 2014). Large majorities of 80% to 90% of U.S.
Americans (Madden & Rainie, 2015)
and Europeans (European Commission,
2015) say that it is
important to them to be in control of
determining who can obtain
information about them and
what information is collected about
them (see also Sarikakis & Winter,
2017).
There is no question that users face
decreasing informational control while
communicating via social media. Due
to their networked nature, social media
applications do not
allow users to control what friends,
acquaintances, institutions, or
companies do with the
information, pictures, and stories that
are shared online (Marwick & boyd,
2014). Further, social media communication extends into ever larger parts of users’ lives. Current applications and devices increasingly aggregate information and exert automatic control (Anderson,
Rainie, & Duggan, 2014). As a reaction
to the increasing amounts of data that
are exchanged and
the sociality of such data, 91% of users
perceive that they have lost control
over how their
personal information is collected and
used by friends, acquaintances, and
colleagues (Quinn,
2014) and especially by companies and
governments (Madden, 2014).
These two observations—the
understanding of privacy as control on
the one hand and the
experience of decreasing control over
information while using social media
on the other—can be
termed a control issue of privacy. In the
remainder of this article, I will suggest
an understanding
of privacy that is adequate for social
media use and the requirements
emerging from this issue.
The Relationship of Privacy and
Control
Privacy is a concept that has been
considered and defined in very
different disciplines,
from descriptive, empirical, and
normative perspectives (Sevignani,
2016; Trepte & Reinecke,
2011). In earlier days, privacy was
considered a human right and identified
as the “right to be let
alone” (Warren & Brandeis, 1890, p.
75). Later and more specifically,
privacy was defined as
“the claim of individuals, groups, or
institutions to determine for themselves
when, how, and to
what extent information about them is
communicated to others” (Westin,
1967, p. 7) or “the
selective control of access to the self”
(Altman, 1975, p. 24).
Informational control has only seldom
been defined, but the most common
definitions
touch either a static or behavioral
aspect of control: Informational control
foremost means that
owners of a certain piece of
information have a choice over
whether, when, and to what extent
they will disclose or withhold personal
information (Crowley, 2017; Tavani,
2007). Here control
is static, a question of more or less, yes
or no. It can be understood as an option
or an available
mechanism. Then, control can be
exerted actively (e.g., restricting access
to information,
audience segregation, self-censorship,
encryption), ambiguously (e.g.,
softening the truth,
obfuscating information, or engaging
in other forms of partial disclosure), or
passively (e.g.
unintentionally omitting information)
(Crowley, 2017; Ochs & Büttner,
2018). In this rather
behavioral understanding,
informational control is executed and
experienced by the individual
person. In both perspectives, control is
centered around the individual and
individual decision
making.
The majority of privacy theories are
devoted to two—somewhat
contradictory—
paradigms: I will call the first
paradigm “privacy as control,” because
here, privacy and control
are strongly connected, and, the second
paradigm “privacy and control,”
because here, privacy
and control are treated as separate
constructs with conditional
relationships. I will then suggest a
third perspective that redefines the
meaning and impact of control and the
conditions among
which control becomes relevant. This
perspective will be summarized in the
social media privacy
model.
Paradigm 1: Privacy as Control
In the seminal work by Altman (1975)
and the privacy regulation model of
self-disclosure
(Derlega, Metts, Petronio, & Margulis,
1993), control was set forth as the
crucial mechanism of
privacy. More recent
conceptualizations have also referred to
control as a precondition of privacy
(Petronio, 2002). Even in their very
first conceptualizations of privacy,
Warren and Brandeis
(1890) referred to privacy as the right
to control what others publish about
oneself. In an
overview of privacy theories, Smith,
Dinev, and Xu (2011) investigated 448
publications on
privacy. They found that—besides an
understanding of privacy as a value—
the cognate-based
understanding of privacy as control has
dominated the social sciences.
The vast majority of privacy scholars
have referred to control as a dynamic
behavior in
the process of privacy regulation to
grant access or to deny access. Altman
(1975) suggested a
process model with three steps: First, an individual assesses the desired level of privacy; second, the individual regulates privacy by controlling interpersonal boundaries; and third, the individual assesses the achieved level of privacy. In this flow model, the crucial role assigned to control becomes apparent. On the basis of this notion,
Petronio (2002) articulated
how control is the engine of privacy
management. In her understanding, an
individual jointly
manages and coordinates rules with
others while interacting with them.
Here again, control is not only the behavior through which privacy can be gained; it is also the means by which the status quo of privacy is assessed, which in turn determines the extent to which further privacy regulation is engaged in through an exertion of control.
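As an illustrative aside (not part of the original theorizing), Altman's cyclic regulation process can be sketched as a small feedback loop in Python. The numeric privacy levels, the step size, and the function names are invented for the example; the theory itself specifies no such quantities:

```python
# Illustrative sketch of Altman's (1975) cyclic privacy regulation:
# compare the desired with the achieved level of privacy and adjust
# boundary control until the two match. All numbers are assumptions.

def regulate(desired: float, achieved: float, step: float = 0.1) -> float:
    """One regulation cycle: open or close boundaries toward the desired level."""
    if achieved < desired:
        return min(achieved + step, desired)   # restrict access (more privacy)
    if achieved > desired:
        return max(achieved - step, desired)   # grant access (less privacy)
    return achieved                            # optimum reached

level = 0.2
for _ in range(10):
    level = regulate(desired=0.6, achieved=level)
print(round(level, 1))  # 0.6: the desired level is reached and then held
```

The point of the sketch is merely that control here is a continuous, self-correcting behavior rather than a one-time setting.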
Privacy scholars have also referred to
the question of what is being
controlled. Here, in
particular, the control of access to
boundaries and the control of the flow
of an interaction were
addressed as the topics or processes
that needed to be controlled (Johnson,
1974; Wolfe &
Laufer, 1974). Further, control over
stimuli that impinge upon a person
were articulated as things
that need to be controlled (Wolfe &
Laufer, 1974). Margulis (1974)
explained that control refers
to all matters being exchanged between
individuals: “Privacy, as a whole or in
part, represents the
control of transactions between
person(s) and other(s)…” (p. 77).
In some theories, control has been used
almost interchangeably with privacy.
For
example, Derlega et al. (1993) stated
that “…privacy represents control over
the amount and kind
of information exchange that persons
have with one another” (p. 67). Then,
Burgoon (1982)
differentiated between four dimensions
of privacy, all of which refer to how
much control an
individual has: Possessing physical
privacy refers to whether and how
much control an individual
perceives to have over physical
boundaries. Social privacy refers to
how much control an
individual perceives to have over the
access of others to the person’s
environments.
Psychological privacy refers to how
much control an individual perceives to
have over emotional
and cognitive input and output. Finally,
informational privacy refers to how
much control an
individual perceives to have over the
use of personal data. In this
conceptualization, the ability to
exert control is the key to an
individual’s privacy perception and, in
turn, regulation. Many
empirical studies have addressed the
relationship between control and
privacy, but only a
minority of studies have supported the
notion that privacy behavior is related
to informational
control (Brandimarte, Acquisti, &
Loewenstein, 2013).
In sum, studies that have been based on
this first paradigm have underscored
the idea that
individuals exert control to achieve
privacy. In turn, privacy should be
achieved if a certain level
of control is successfully executed and
maintained as the status quo. However,
these
conceptualizations of privacy suggest a
linear relationship between privacy and
control. They
assume that “…the more one has
control over this information exchange,
the greater the amount
of privacy one has in a social
relationship” (Derlega et al., 1993, p.
67). However, previous
empirical research did not find a linear
relationship between privacy and
control. Hence, there is
a mismatch between the theoretical
assumption that privacy and control are
closely related on the
one hand and the rare empirical data
supporting this notion on the other.
Paradigm 2: Privacy and Control
In social media, an individual person
cannot easily execute control because
personal
information is exchanged between
many parties and with a broad range of
applications. Users
experience social media as more
confusing, demanding, and complex
with regard to the control
that they have over their personal
information than face-to-face
communication (Marwick
& boyd, 2014; Quinn, 2014). Woo
(2016) expressed this confusion while
mimicking the
presumed thoughts of a user: “Please
do contact me and give me benefits, but
I still do not want
to fully give up my control (but I do
not know how to have that control)” (p.
954). In other
words, users want to take advantage of
the networked nature of social media,
are painfully aware
of the deficits in control, but have not
yet found solutions for how to embrace
their needs for both
gratification and informational control.
This process of weighing privacy risks
and social
gratifications has also been
investigated under the umbrella of the
privacy calculus (Trepte et al.,
2017).
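As an illustrative aside (not from the cited work), the privacy calculus can be reduced to a simple decision rule: disclose when anticipated gratifications outweigh anticipated privacy risks. The category names and weights below are invented for the example:

```python
# Minimal sketch of the privacy calculus (weighing risks against
# gratifications). Categories and weights are illustrative assumptions.

def privacy_calculus(gratifications: dict, risks: dict) -> bool:
    """Return True if expected benefits of disclosing exceed expected costs."""
    return sum(gratifications.values()) > sum(risks.values())

decision = privacy_calculus(
    gratifications={"social_support": 0.6, "self_presentation": 0.5},
    risks={"data_misuse": 0.4, "context_collapse": 0.3},
)
print(decision)  # True: gratifications (1.1) outweigh risks (0.7)
```

Empirically, of course, users rarely perform such an explicit summation; the sketch only makes the trade-off logic of the calculus visible.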
To follow up on the sentiment that
users wish to have informational
control but that
control seems to contradict the
networked nature of social media,
Moor (1997) and later Tavani
(2007) reflected on the relationship
between privacy and control. They
argued that control and
privacy should be seen as separate
constructs and that privacy and control
serve very different
functions. With the Restricted
Access/Limited Control (RALC) theory
of privacy, these authors
defined privacy in terms of the
individual’s protection from intrusion
and information gathering
by third parties. They argued that
control in the information age is
impossible and further that
“We can have control but no privacy,
and privacy but no control” (Tavani &
Moor, 2001, p. 6).
They suggested that privacy and control
should be separated such that privacy is
a concept and a
value that is defined by being protected
from information access by others,
whereas control is one
mechanism that can be used to manage
and justify privacy. Control may be
exerted through
choice, consent, or correction. In the
flow of the exchange of digital
information, people choose
situations according to their
communication goals, level of access,
and emerging privacy needs
(Trepte & Masur, 2017); then, privacy
is maintained through the processes of
consent, and
finally, corrections allow people to
restore their privacy when it gets lost or
threatened. For the
upcoming social media privacy model,
I will refer to this notion that control is
one mechanism
among others, and I will explain that
for all processes (i.e., choice, consent,
correction),
individuals have to get in touch with
others and communicate their motives
and aims.
With her theory of contextual integrity,
Nissenbaum (2010) also addressed the
contextual
requirements as boundary conditions,
regardless of whether control is a
functional mechanism or
not. She suggested that the two sets of
theories be married: those referring to
privacy as a
constraint on access and those referring
to privacy as a form of control. In her
theory of
contextual integrity, Nissenbaum
(2010) described control as one
“transmission principle” (p.
145) that defines how information is
exchanged. Other transmission
principles are reciprocity and
confidentiality. Control as a
transmission principle is appropriate
only if it fits into the particular
context, the subject that users are
talking about, the type of information
that is to be exchanged,
and the actors they communicate with.
From this point of view, there can be
privacy without
control in situations in which control is
inappropriate or not available (Laufer
& Wolfe, 1977;
Slater, 2007).
Current privacy theories pushed the
idea of control as a dynamic attribute of
the situation
one crucial step further. According to
Dienlin’s (2014) privacy process
model, individuals assess
the controllability of the context and
behavior. Masur (2019) adds an
analysis of what is being
controlled by entangling interpersonal
(e.g., the relationship between
interaction partners) and
external factors (e.g., the architecture of
a room) in his theory of situational
privacy and self-
disclosure. Depending on the situation,
these interpersonal and external factors
can be controlled
to different degrees, and in turn, they
can elicit differential levels of self-
disclosure. Self-
disclosure will be understood as “the
intentional communication of
information about the self to
another person or group of people”
(Masur 2019, p. 70) in the remainder of
this article.
The notion that privacy and control are
not necessarily connected has been
supported by
previous research (Saeri, Ogilvie, La
Macchia, Smith, & Louis, 2014). For
example, Zlatolas,
Welzer, Heričko, and Hölbl (2015)
demonstrated that privacy norms,
policies, and awareness but
not privacy control were related to the
self-disclosures of N = 673 Slovenian
Facebook users. In
a U.S. sample of N = 249 Facebook
users, Taneja, Vitrano, and Gengo
(2014) found that
perceived behavioral control and the
intention to engage in privacy-related
behavior were
unrelated. Eastin et al. (2016)
investigated how different variables
predicted mobile commerce
activity and found that control was the
one that explained the smallest amount
of variance. In
particular, trust and attitude toward
mobile commerce were more important
predictors than
control. In sum, individuals who
perceived that they had control over
their personal data did not
necessarily feel they had more privacy
and did not increasingly engage in self-
disclosure. Further,
trust and norms were identified as
important alternative mechanisms of
privacy (Brandimarte et
al., 2013; Eastin et al., 2016;
Nissenbaum, 2010; Zlatolas et al.,
2015). I will refer to both
findings in the social media privacy
model.
The Interplay of Affordances, Control,
and Privacy
The lack of a relation between privacy
and control might hint that the interplay
of the two
variables is not linear and is actually
more complex (Laufer & Wolfe, 1977).
The relation
between control and privacy should
become clearer if the social media
boundary conditions that
make control a functional mechanism
in one situation but impractical in
another are elucidated.
Social Media and Its Boundary Conditions for Privacy
Carr and Hayes (2015) defined social
media as “…Internet-based channels
that allow
users to opportunistically interact and
selectively self-present, either in real-
time or
asynchronously, with both broad and
narrow audiences who derive value
from user-generated
content and the perception of
interaction with others” (p. 50). They
further pointed out that users’
interaction will increasingly be
influenced by social media affordances.
Further, social media has
been characterized by its content, its
users, and its infrastructure in previous
definitions (Howard
& Parks, 2012). The most prominent
examples of social media are social
network sites (e.g.,
Facebook, Instagram, LinkedIn,
Google+), multimedia platforms (e.g.,
Youtube, Slideshare,
Soundcloud), weblogs (e.g., personal
diaries of mothers, scholars, or self-
appointed or paid
influencers), and microblogs (e.g.,
Twitter). In struggling to develop a
definition of social media,
scholars have pointed to the fact that
social media channels are formally
understood as methods
of mass communication but that they
primarily contain and perpetuate
personal user interactions
(Carr & Hayes, 2015; Papacharissi,
2010). In this sense, social media can
be referred to as
personal publics (Schmidt, 2014). As a
consequence, users cannot always
clearly define the
somewhat blurred lines between
personal and public or between private
and professional
communication. They feel that contexts
collapse and converge (Papacharissi,
2010; Vitak, 2012).
In sum, social media is characterized
by the following boundary conditions:
the content, its flow
and further uses (Howard & Parks,
2012); the communication practices
that users perceive as
their options for exerting control or for
achieving privacy with other means;
and social media
affordances (Carr & Hayes, 2015). In
the following, I will analyze how the
interplay of these
boundary conditions is related to
control and how it determines different
privacy perceptions and
behaviors. Tables 1 and 2 in the
Supplemental summarize this
theoretical development.
Social Media Boundary Condition 1: Content, Its Flow, and Uses
What exactly does an individual strive
to control? The sooner we come to
understand
what individuals strive to control, the
better we can evaluate whether control
can be experienced
in social media. According to most
studies, personal information refers to
the content that people
strive to control in order to maintain
their privacy in social media. Metzger
(2004) referred to
personal information as the content to
be controlled. Quinn (2014) suggested
different layers of
how privacy can be maintained. On the
“content layer,” users’ experience of a
lack of control
leads them to limit the information they
post or even to post false information.
Sarikakis and
Winter (2017) added on the basis of
their qualitative work that users do not
differentiate between
personal information and personal data.
Instead, they define the degree of
intimacy or privacy
needed for a certain piece of
information or data.
Then, besides personal information, the
flow and use of the content need to be
considered. Social media advocates
specifically address where online
information is forwarded,
archived, and sold. They emphasize
users’ concerns about how little control
they have over the
flow and use of personal information
(Marwick & boyd, 2014; Quinn, 2014;
Tsay-Vogel,
Shanahan, & Signorielli, 2018). This
refers to the forms personal
information takes, to where it
ends up and how it is used. In the
following, personal information, its
flow, and further uses will
be considered as what users strive to
control.
Social Media Boundary Condition 2:
Practices of Control
Actively exerting control expresses
informational self-determination,
which implies
having power over information and
agency in decisions regarding this
information. In turn, a loss
of control would mean that other
behavioral options are out of reach and
that individuality (Buss,
2001), power, and agency are severely
threatened (Brummett & Steuber,
2015). Control also
comes along with risk avoidance:
Users have identified the most
important pieces of information
that they want to control as the contents
of their emails, the contents of their
online chats, and
their location (Cho, Lee, & Chung,
2010). As long as they have control
over this information,
they can avoid being harassed, bullied,
financially exploited by companies, or
surveilled by
governmental institutions.
How is control executed and achieved?
First, informational control can be
identified as an
individual’s perception that he or she
has a choice about whether to withhold
or disclose
information (Crowley, 2017; Johnson,
1974). Choice is the first step and
determines whether
control can be exerted and to what
degree (Wolfe & Laufer, 1974). Then,
in the next step, when
choice is available, it has to be put into
action. Two active practices of control
in social media are
consent and correction (Tavani, 2007).
Consent refers to the extent to which
users agree that a
certain piece of information will be
passed along. Correction means that
users are able to
withdraw from this agreement.
Whereas choice was identified long
ago as a practice of control
(Johnson, 1974), consent and correction
were suggested as social media
practices (Tavani, 2007).
Further, informational control can also
be put into practice by the selective
sharing of
information, self-censorship, audience
segregation, and encryption (Ochs &
Büttner, 2018). All
of these options are directed by the
individual and can be considered to be
ego-centered. It will be
important to also find terms for
interpersonal privacy regulation
behaviors.
Social Media Boundary Condition 3:
Affordances
Social media can be characterized by
affordances. The term affordance,
initially
suggested by Gibson (1979/2014),
means that the environmental
characteristics of a certain entity
are not static but differently perceived,
experienced, and as such shaped by
humans. In the case of
social media, this understanding is
more than suitable. Of course the
process of shaping or
“furnishing” (Gibson, 1979/2014, p.
78) social media influences its further
uses. For example,
teenage users regulate their privacy
through social steganography, an
idiomatic language that is
understood only by their peers but not
their parents, instead of managing their
audiences by
editing their friends lists and
systematically blocking their parents or
certain peers (boyd, 2014).
Inventing and using this kind of
idiomatic language might influence
users’ style of
communication and interactions on
social media. A selection of four
affordances have repeatedly
been shown to be particularly important
for social media: anonymity,
editability, association, and
persistence (boyd, 2008; Evans, Pearce,
Vitak, & Treem, 2017; Treem &
Leonardi, 2012). The
affordances of anonymity and
editability allow users to exert control.
By contrast, the affordances
of association and persistence
challenge users’ ability to exert
control. Both clusters of
affordances—those enhancing (Table
1) as well as those challenging control
(Table 2)—have
different implications for how content
is controlled, which practices of
control are available, and
how they affect privacy regulation in
social media realms. In the following, I
intertwine the
results from previous research on
privacy, the content and practices of
control, and affordances.
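The assumed relation between the two affordance clusters and informational control, as summarized in Tables 1 and 2, can be rendered in a short illustrative sketch. This is a hypothetical shorthand of my own, not part of the model's formal vocabulary:

```python
# Hypothetical shorthand for the two affordance clusters (Tables 1 and 2);
# the labels "enhances"/"challenges" summarize the assumed effect on control.
AFFORDANCE_CONTROL = {
    "anonymity": "enhances",      # the sender or source can remain unknown
    "editability": "enhances",    # messages can be crafted and corrected
    "association": "challenges",  # privacy becomes interdependent with others
    "persistence": "challenges",  # content stays accessible for unforeseen audiences
}

def affordances_by_effect(effect, mapping=AFFORDANCE_CONTROL):
    """Return the affordances with the given assumed effect on control."""
    return sorted(a for a, e in mapping.items() if e == effect)
```

On this reading, `affordances_by_effect("enhances")` would return anonymity and editability, the cluster discussed first below.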
Anonymity. The affordance of
anonymity describes the idea that other
social media
agents such as other private people,
institutions, or companies do not know
the source of the
message or the sender (Evans et al.,
2017). For social media use, being
unknown to others and
anonymously using social media is rare
(Rainie, Kiesler, Kang, & Madden,
2013). Nevertheless,
anonymity is still appreciated
occasionally. For example, users
commenting on other users’ posts
in self-help or political forums can
decide to keep their interactions
anonymous. In addition, users
of dating websites might at least partly
or temporarily use such sites
anonymously (Ramirez,
Bryant, Fleuriet, & Cole, 2015).
Anonymity is not a question of “on” or
“off” but is flexible
and scalable (Evans et al., 2017).
Anonymity has spurred enormous
attention in research on
computer-mediated communication
(CMC; Joinson, 2001). Here,
anonymity is specifically
considered a means to control the
access of certain individuals (Qian &
Scott, 2007). Further,
while being online anonymously, an
individual can deliberately decide what
information to share,
with whom to share it, and what to
withhold (Qian & Scott, 2007). Further, when the sender is anonymous, the receiver of a message cannot draw on the cues available in face-to-face interaction,
and as such, the control lies in the
hands of the sender (Ben-Ze'ev, 2003).
Control over content, its flow, and its
uses is possible because, in a state of
anonymity, all
of these are disconnected from the user.
Although full anonymity is not afforded
by social media,
U.S. American users occasionally keep
their posts anonymous while using
social media with the
clear aim of exerting control (Rainie et
al., 2013). For example, they disguise
their location or
delete cookies so that companies
cannot identify them. An interview
partner in Sarikakis and
Winter’s (2017) study said: “Well
when I use fake names or email
addresses and fake birthdates I
think that’s the only control you can
try to have” (p. 9). Two aspects,
though, might mitigate the
perception of control. First, users know
that they leave traces behind, and once
they have posted
content online—even if it was posted
anonymously—they might be traceable
because of other
information they left online; second,
anonymity is usually used only to some
extent (e.g., by
leaving one’s name but not one’s
address), and users acknowledge that
with partial anonymity,
they experience only partial control,
and in turn, only partial privacy (Rainie
et al., 2013). What
are the available practices users have to
exert control? First, being online
anonymously can be
considered a question of choice. Woo
(2016) suggested that anonymity—or
by contrast,
identifiability—is the most important
issue in online privacy. He argued that
on the social web,
users should have “islands” of
anonymity. He has even encouraged
people to lie and to have
secrets with the aim of regaining
control and autonomy. Then, however,
in an anonymous setting,
consent and corrections are somewhat
disconnected from the user as these are
identity-related
practices.
Anonymity can be an enactment of
control by disguising or lying about
one’s identity or
by leaving it unspecified for some
applications and occasions. Also,
people may leave the source
of their own messages unknown. And,
of course, these enactments of control
might be applied
when interacting with some users but
not with others. In sum, the affordance
of anonymity is
related to informational control, which
has also been demonstrated in
empirical studies (Fox &
McEwan, 2017).
In previous research on privacy,
anonymity has played a crucial role.
Here, it was even
understood as a “type” of privacy
among other types such as solitude,
intimacy, or reserve
(Pedersen, 1999; Westin, 1967). In
sum, anonymity allows users to exert
control and, in turn, it
increases an individual’s subjective
experience of privacy by not being
identifiable at all or by
selectively presenting parts of one’s
own identity (Smith et al., 2011; Woo,
2016).
Editability. While using social media,
users interact remotely in terms of time
and space.
This gives them the chance to edit their
online communications before and
after the
communications are seen by others
(Treem & Leonardi, 2012). Editability
is an affordance that
was previously addressed as important
for CMC in the hyperpersonal model
(Walther, 1996):
Senders of online messages self-
selectively present themselves online
by primarily transmitting
cues and messages that they want
others to see and that put them
in a positive light. In
addition, although editing one’s identity
is part of any social interaction, social
media platforms
offer more freedom in terms of what,
how, and when these interactions are
edited. Editing allows
users to rehearse, change, package, or
literally craft a message and, in turn, to
rehearse, change,
and craft their personal appearance.
Editability allows the message sender
control over content and its flow and
uses because
users have the chance to ponder the
consequences of their posts (Treem &
Leonardi, 2012).
Further, users may control the flow and
further use of their content by
articulating the lists of
online friends that they connect with or
by using a private or public profile
(Ellison & boyd,
2013). The availability of control
practices is highly supported by social
media’s affordance of
editability (Fox & McEwan, 2017).
Users have a choice to either intuitively
post their thoughts or
to pictorially represent their nonverbal
cues. Editing can be considered an
active enactment of
control because users deliberately
choose what to reveal to certain
audiences or what to withhold
(Crowley, 2017). Further, corrections
of one’s own posts and decisions are
possible and can also
be conceived as an enactment of
control (Crowley, 2017).
Control over the flow of information
might also foster subjective
experiences of privacy. In
privacy research, exerting control over
the flow of an interaction was often
understood
synonymously with control or as a
transmission principle that guaranteed
privacy (Nissenbaum,
2010).
Association. Social media platforms are
primarily used because they offer users
the chance
to connect with others and to stay in
touch. Users have articulated the idea
that communication is
their most important motive for using
social network sites (Quinn, 2016).
Consequently, the most
important affordance of social media is
the associations that are created or
maintained between
interaction partners (Ellison & boyd,
2013; Treem & Leonardi, 2012).
The affordance of association and the
chance to exert control over content, its
flow and
uses seem strikingly incompatible. In
their cross-sectional study of N = 875
Mechanical Turk
workers, Fox and McEwan (2017)
demonstrated that informational
control and network
association were negatively related.
Control is exerted by an individual
person, and as such, the
individual person is at the center, if not entirely responsible, for achieving
privacy via control. This
is clearly expressed in users’ current
understanding of control. For example,
participants in
Sarikakis and Winter’s (2017) study
identified the individual as the
legitimate “controller” of
privacy (p. 6). Also, boyd (2014)
argued that exerting control with the
aim of achieving privacy
requires the individual’s full
consideration, power, knowledge, and
skills. The only control
practice available would be to
completely withdraw (i.e., not to
participate) and to accept the
disadvantages that come along with
such a decision. Then, ambiguous and
passive enactments of
control might be used, but these have
the same disadvantages.
In sum, control as an issue and a
concept takes the perspective of the
individual. In other
words, it is an “ego-centered need”
(Papacharissi, 2010, p. 144). By
contrast, association is an
interindividual concept. Very likely,
one person’s control ends at the point
where another
person’s control starts. Hence, as much
as the social web is an interdependent
realm involving
other parties, control is not the means
to ensure or experience privacy. On the basis
of these
considerations—and this will be
important for the upcoming
propositions on social media
privacy—other means and mechanisms
to guarantee the subjective experience
of privacy have
become necessary: Users communicate
with each other to ensure privacy. And
further, if
communication is not possible, they
choose communication partners—
individuals, organizations,
institutions—that they can trust. Trust
has been shown to be crucial for
ensuring the perception of
privacy and subsequent online
disclosure (Metzger, 2004). Trust can
be established by personal
communication (Petronio, 2002) and by
norms that the individual user can rely
on (Saeri et al.,
2014). As such, in the upcoming
propositions and the social media
privacy model, trust,
communication, and norms are
conceptualized as the core mechanisms
to ensure privacy beyond
control.
Persistence. The affordance of
persistence addresses the durability of
online expressions
and content (boyd, 2014) and the idea
that after personal information is
published online, it is
automatically recorded and archived
and is consequently replicable (boyd,
2008). It means that
data remain accessible in the same
form over long periods of time and for
diverse and unforeseen
audiences (Evans et al., 2017; Treem &
Leonardi, 2012). Some authors have
emphasized the
positive outcomes that persistence may
have, namely, that it allows knowledge
to be sustained,
creates robust forms of
communication, establishes the growth
of content (Treem & Leonardi,
2012), and allows for authenticity and
longevity (boyd, 2008).
However, as persistence comprises the
endless and infinite nature of online
data, it also
expresses a loss of informational self-
determination. Persistence seems
incompatible with
control. It evokes the idea that control
over content, its flow, and uses is
impossible because once
personal information is posted online,
it is no longer under the sender’s
control. With regard to
control practices, social media users
have the choice to completely
withdraw from online
interactions, to not post their
information, and thus to avoid its
persistence. At the same time, this
would prevent them from receiving the
myriad benefits that come along with
data sharing and
thus does not seem to be a question of
freedom of choice anymore. Also, as
soon as the choice is
made to participate, other control
practices such as consent and correction
significantly decrease.
Finally, once given, consent decreases
a person’s chances to subsequently
correct previous online
communication. Users know that they
do not have control over how persistent
their data will be,
and this significantly decreases their
subjective experience of privacy
(Rainie et al., 2013). The
lack of control over personal
information due to the persistence of
all kinds of online information
can be considered one of the key issues
of online lives and the subjective
experience of privacy
today. To react to users’ needs to
foresee and understand persistence
(Rainie et al., 2013),
communication, trust, and norms seem
to be adequate ways to ensure privacy.
The Social Media Privacy Model
Social media privacy is based on
interpersonal processes of mutual
disclosure and
communication (Altman, 1975;
Petronio, 2002). Further, it can be
considered a value that is co-
developed by engaging in
communication and that is expressed
by a shared perception
(Nissenbaum, 2010; Smith et al.,
2011). On the basis of these
considerations, I propose:
Proposition 1: Privacy is
interdependently perceived and valued.
In contrast to privacy, control is at the
center of the individual person. Control
is exerted
by the individual person, and it can be
considered to be ego-centered
(Papacharissi, 2010;
Sarikakis & Winter, 2017). Social
media platforms aim for
connectedness, interdependence, and
sociality and can be described as
social-centered (Ellison & boyd, 2013).
As a consequence,
social media privacy cannot be
sufficiently achieved by exerting
control. Other mechanisms are
necessary to ensure social media
privacy.
Proposition 2: Privacy cannot be
satisfactorily achieved by exerting
control in social media.
Instead of striving for control as an
end-game of privacy, the opposite is
necessary to
experience privacy in social media.
Users need to constantly communicate
with each other as
well as with institutions and companies
to ensure their privacy. They need to
engage in both
interpersonal and deliberative
communication processes.
Interpersonal communication is
understood as interactions between
users as well as interactions between
the individual user and
others who represent third parties such
as institutions and companies.
Deliberation is defined as
either informal or institutionalized
interaction among internet users (and
eventually
representatives of governments,
institutions, or companies), involving
rational-critical decision
making and the earnest aim to find a
solution (Burkhalter, Gastil, &
Kelshaw, 2002).
Proposition 3: Interpersonal
communication is a mechanism by
which social media privacy can
be interdependently ensured and put into
effect.
However, not all actions and steps of
online media use can be accompanied
by
communication processes. For many if
not most questions of privacy, people
can rely on past
experiences. Here, communication and
deliberation crystallize into a stable
relationship-based
result or even solution, i.e. trust (or
mistrust) and norms (or anomia). Trust
can be defined as an
anticipation and expectation of how a
person or institution will behave and as
such reduces
uncertainty (Waldman, 2018, p. 4).
Trust has been shown to ensure privacy
on the basis of
longstanding communication and
reliable bonds (Saeri et al., 2014). Trust
can be conceived as
both a crucial factor of influence in
decisions over self-disclosure and as a
result of
communication (Saeri et al., 2014). In
turn, mistrust—which has not yet been
addressed in
privacy research—is a menace to
privacy and as such should initiate
communication. In a
qualitative study on privacy
perceptions, Teutsch, Masur, and
Trepte (2018) demonstrated that
participants perceived that they had lost
control and had substituted trust for
control. One of the
interview partners said, “Well, privacy
is absolute trust between conversational
partners and
absolute, absolute certainty that the
subject of conversation will stay within
this sphere” (p. 7).
Eichenhofer (2019) suggested that the
“trust paradigm” should point toward a
more current
perspective on privacy regulation via
trust in contrast to privacy regulation
via control or self-
determination.
In the case of privacy regulation, both
social and legislative norms come into
play (Gusy,
2018; Spottswood & Hancock, 2017;
Utz & Krämer, 2009). Social norms are
understood as
social pressure to engage in a certain
kind of behavior and are established by
what others approve
of (injunctive norms) and what they
actually do (descriptive norms) (Saeri
et al., 2014). Although
legislative norms are coined by
jurisprudence, regulated by law (and
not on the basis of observing
others), they resemble social norms in that they prescribe a certain behavior and allow for sanctions if this behavior is not shown. Previous research has shown that trust and norms are the keys to obtaining privacy
(Marwick & boyd, 2014; Quinn, 2014).
To establish trust
and norms, of course, communication
is necessary.
Proposition 4: Trust and norms
function as privacy mechanisms that
represent
crystallized privacy communication.
I suggest that control and
communication have found a new
balance in social media
communication: Control is losing
control and communication is gaining
power. In other words,
users do not solely rely on and strive
for control in the first place but strive to
communicate about
privacy to establish norms and trust
and even sometimes to regain control.
In fact, privacy’s
interdependence is expressed and put
forward by interpersonal
communication. This emerging
social turn in privacy theory is also
acknowledged in the definition of
privacy.
I define privacy by an individual’s assessments
of (a) the level of access to this
person in an interaction or relationship with
others (people, companies,
institutions) and (b) the availability of the
mechanisms of control,
interpersonal communication, trust, and norms
for shaping this level of access
through (c) self-disclosure as (almost intuitive)
behavioral privacy regulation
and (d) control, interpersonal communication,
and deliberation as means for
ensuring (a somewhat more elaborated)
regulation of privacy. In social media,
then, the availability of the mechanisms that
can be applied to ensure privacy
is crucially influenced by the content that is
being shared and the social media
affordances that determine how this content is
further used.
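The definition's four components can be rendered as a simple data structure for illustration. The field names below are my own shorthand for components (a) through (d), not the article's terminology:

```python
# Hypothetical rendering of the definition's components (a)-(d);
# field names are illustrative shorthand, not the article's terms.
from dataclasses import dataclass, field

@dataclass
class PrivacyAssessment:
    # (a) the perceived level of access to the person in an interaction
    level_of_access: float
    # (b) the mechanisms available for shaping that level of access,
    #     e.g., control, interpersonal communication, trust, norms
    available_mechanisms: list
    # (c) self-disclosure as almost intuitive behavioral regulation
    self_disclosure: bool = True
    # (d) more elaborated regulation: control, interpersonal
    #     communication, and deliberation
    elaborated_regulation: list = field(default_factory=list)
```

Such a structure makes explicit that components (c) and (d) are behaviors conditioned on the assessments in (a) and (b).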
In the following, I will summarize the
theoretical rationale developed in this
article in the
chronology of a communicative
process. Further, I will show how the
individual’s privacy
assessments referred to in the four
propositions eventually lead to different
forms of privacy
regulation behaviors. The process is
illustrated in Figure 1. The following
steps are meant to
make the model accessible for
empirical investigation.
The first part of the flowchart refers to
the social media user’s subjective and
initial
assessment: All humans have
individual levels of access they
perceive as being more or less
adequate and comfortable. This
dispositional level of access is a
quantifiable dimension varying
between high and low levels and
expressing the individual’s
dispositional willingness to self-
disclose. In contrast, the
communication goal is rooted in the
situation and defines what is to be
achieved in this particular situation.
The individual’s communication goals
in social media are
manifold and important to consider
when assessing online privacy. They
can most likely be
understood as a qualitative scenario of
what the individual user wants and
needs to communicate
in a certain situation. Hence, the point
of departure for each and any
consideration about privacy
are the more or less consciously asked
questions: How do I feel, what do I
need, and what is my
goal for this particular situation?
The second part of the model refers to
the social media boundary conditions
that are
encountered. Here, content and
affordances dynamically interact (as
indicated with the
multiplication sign) with an
individual’s initial assessment. The
individual weighs the ideal level
of access and his/her communication
goals against social media boundary
conditions by
considering what content is shared and where; how it might flow from one user or institution to another; and how it might be used.
Social media content becomes dynamic
as it is displayed and
shared. Affordances represent this
dynamic and, together with the
individual’s dispositions and
goals, shape the available privacy
mechanisms: control, trust, norms, and
interpersonal
communication. Hence, users assess
whether they have a choice, whether
they can rely on trust or
norms, or whether they will (have to)
engage in interpersonal
communication.
The third part of the model refers to the
subjective experience of privacy. The
individual
experiences a certain level of access
that results from the individual’s goals
on the one hand and
the social media boundary conditions
and privacy mechanisms that are
applied to actively
regulate privacy on the other. This
experience is here understood as the
rather unfiltered
accumulation of external stimuli and
internal needs and then results in a more elaborated re-assessment, i.e. the privacy perception, which can be verbalized and is empirically accessible.
The privacy perception results in
different forms of privacy regulation
behaviors. First,
self-disclosure belongs to the most
intuitive regulation behaviors and
includes all information
intentionally shared (or not shared)
with others. And, for the case that the
privacy mechanism of
control is available and considered
adequate for a certain communication
goal, users exert control
actively and intentionally by restricting
access to information, audience
segregation, self-
censorship, encryption; or, rather
ambiguously by softening the truth,
obfuscating information.
When the privacy mechanisms do not
allow for deliberate and somewhat
egocentric privacy
regulation (i.e. when individuals do not
have at least partial control), other
regulation behaviors
come into play. The individual might
engage in interpersonal communication
or even deliberation
to negotiate and interdependently
regulate privacy. Interpersonal
communication, deliberation,
and control are meta-level regulation
behaviors that come into play when
privacy behaviors are
not intuitive, when contexts collapse, or
when a certain situation demands
further elaboration
and/or communication.
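The selection among regulation behaviors described above can be sketched in a minimal, hypothetical form; the function and argument names are illustrative assumptions, not the model's formal vocabulary:

```python
# A minimal, hypothetical sketch of the regulation step described above.
# Names are illustrative; the actual model is a continuous, situated process.
def select_regulation_behaviors(control_available, trust_or_norms_reliable):
    """Pick privacy regulation behaviors from the available mechanisms."""
    behaviors = ["self-disclosure"]  # intuitive regulation always applies
    if control_available:
        # Deliberate, ego-centered regulation: restricting access,
        # audience segregation, self-censorship, encryption.
        behaviors.append("control")
    elif not trust_or_norms_reliable:
        # No individual mechanism suffices: privacy must be negotiated
        # interdependently (Propositions 3 and 4).
        behaviors.append("interpersonal communication or deliberation")
    return behaviors
```

Read this way, meta-level regulation is triggered exactly when neither control nor crystallized trust and norms carry the situation.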
The communication process shown in
Figure 1 will take turns in a constant
flow of
assessments and re-assessments as soon
as the reactions of others lead to
changes in conditions or
when personal goals or needs change.
In what follows, I will discuss the
capabilities and
consequences of the model’s theoretical
propositions. What are the possible
effects and what are
the pitfalls and opportunities that will
occur if communication, trust, and
norms are substituted
for control?
Challenging the Social Media Privacy
Model
Social media use is at the heart of
human communication and offers all of
its many merits
such as practicing freedom of speech or
reaching out to people in critical living
conditions and
providing them with social support. In
addition, social media communication
offers particular
benefits because it is ubiquitous and
independent of time and location. As
such, for example, it
allows people to communicate across
borders into the lands of friends, foes,
and fiends. All of
these merits of online communication
give its users enormous freedom. This
freedom is—and
this is again one of the major merits of
social media communication—often
independent of the
domestic or cultural situation of the
user. Freedom is historically closely
linked to privacy. In
medieval times, only those who had
land were free (Moore, 1984). And
only those who
possessed land had the freedom to
withdraw, grant, or restrict access to
their land. In turn, the
“unfree,” who did not have their own
land or possessions, did not have the
right to privacy. In
this sense, freedom is the origin both in
legislation and the genealogy of privacy
(Gusy, 2018).
For social media and online realms,
freedom and privacy are closely
connected. However,
the ideas of possessions and ownership
do not seem fully applicable anymore.
Borders and
possessions have become fuzzy
because, very often, data and personal
information are perceived
as shared goods as soon as they appear
online. Nissenbaum (2010)
summarized her work on
contextual integrity with the words:
“We have a right to privacy, but it is
neither a right to control
personal information nor a right to
have access to this information
restricted” (p. 231). She
conceded that for social media, privacy
is rather a value and a perception.
Borders and possessions have a
permanent nature. By contrast, values
and perception are
subject to interpretation and change.
This understanding of privacy as
subject to interpretation
and change makes it a matter of
communication. Eventually,
communication about privacy will
result in trust, and if successfully
shared in a society, in social and
legislative norms. However,
communication cannot be understood
as a long and painful process that will
finally show the
solution and lead us into the light of
privacy. Just the opposite is the case. In
social media,
communication about data sharing and
the use and flow of interpersonal data
is the solution itself.
Only due to ongoing and critical
assessment, reassessment, and dynamic
communication will we
have the chance to ensure privacy as
one of the most important values of
civilized societies.
Privacy’s interdependence is expressed
and put forward by interpersonal
communication.
And in fact, privacy is experiencing a
“social turn”, in social media and
beyond (Helm &
Eichenhofer, 2019). This emerging
social turn in privacy theory is
acknowledged in the social
media privacy model. However, there
are downsides to a conception of
privacy as a
communicative process. First, not all
members of this communicative
process will have equal
chances of being heard. For example,
digital and literacy gaps have been
shown to crucially
influence participation in these
communication processes (Helsper,
2017).
Second, online information has become
a commodified good, and financial
interests are
stark (Sevignani, 2016). Purposeless
communication among friends is
increasingly exploited for
economic reasons (Seubert & Becker,
2019); and, companies do their best to
avoid interactions
between individual users who strive to
regulate their privacy. In turn, they try
to establish trust
through strong branding activities that
have been shown to override social
media users’ privacy
concerns and their interest in solving
privacy issues by actively
communicating and participating
(Boerman, Kruikemeier, & Zuiderveen
Borgesius, 2018; Li, 2011).
Third, as a consequence, the requirements of communication and trust demand a lot from users. Control implies a settled state in which the individual can lean back and stop thinking about the flow of personal information online. Communication, trust, and norms, by contrast, are subject to change and thus require constant assessment and consideration. Hence, information control should also be considered with regard to an individual’s ability to exert control (Grimm & Bräunlich, 2015). The user-centered perspective needs to be complemented and accompanied by a system-based perspective and respective interventions (Schäwel, 2020).
Fourth, this demanding process of communication might also threaten the ability to develop a self-determined identity. Westin (1967) held that boundary control means identity control: “This deliberate penetration of the individual’s protective shell, his psychological armor, would leave him naked to ridicule and shame and would put him under the control of those who knew his secrets” (p. 33). As a consequence, lost control would mean threats to identity development.
Finally, I embrace the demanding and somewhat stressful nature of communicating about privacy in social media. In social media, there is only limited control over personal information, and handling this lack of control is itself demanding and stressful. Acknowledging these two observations means recognizing that users need to take action, engage in communication, and establish trust as well as shared social and legislative norms. In social media, privacy is not a private affair; it is at the center of communication. We are out of control because we have so much to share. Hence, interpersonal communication, trust, and norms are the three most important mechanisms that interdependently help to ensure social media privacy.
References

Altman, I. (1975). The environment and social behavior: Privacy, personal space, territory, crowding. Monterey, CA: Brooks/Cole Publishing Company.

Anderson, J., Rainie, L., & Duggan, M. (2014). Digital life in 2025. Retrieved from http://www.pewinternet.org/2014/03/11/digital-life-in-2025/

Ben-Ze'ev, A. (2003). Privacy, emotional closeness, and openness in cyberspace. Computers in Human Behavior, 19(4), 451–567. https://doi.org/10.1016/S0747-5632(02)00078-X

Boerman, S. C., Kruikemeier, S., & Zuiderveen Borgesius, F. J. (2018). Exploring motivations for online privacy protection behavior: Insights from panel data. Communication Research, 25. https://doi.org/10.1177/0093650218800915

boyd, d. (2008). Taken out of context: American teen sociality in networked publics (Doctoral dissertation). University of California, Berkeley.

boyd, d. (2014). It's complicated: The social lives of networked teens. New Haven, CT: Yale University Press.

Brandimarte, L., Acquisti, A., & Loewenstein, G. (2013). Misplaced confidences: Privacy and the control paradox. Social Psychological and Personality Science, 4(3), 340–347. https://doi.org/10.1177/1948550612455931

Brummett, E. A., & Steuber, K. R. (2015). To reveal or conceal? Privacy management processes among interracial romantic partners. Western Journal of Communication, 79(1), 22–44. https://doi.org/10.1080/10570314.2014.943417

Burgoon, J. K. (1982). Privacy and communication. Communication Yearbook, 6(4), 206–249. https://doi.org/10.1080/23808985

Burkhalter, S., Gastil, J., & Kelshaw, T. (2002). A conceptual definition and theoretical model of public deliberation in small face-to-face groups. Communication Theory, 12(4), 398–422. https://doi.org/10.1093/ct/12.4.398

Buss, A. (2001). Psychological dimensions of the self. Thousand Oaks, CA: Sage.
Carr, C. T., & Hayes, R. A. (2015). Social media: Defining, developing, and divining. Atlantic Journal of Communication, 23(1), 46–65. https://doi.org/10.1080/15456870.2015.972282

Cho, H., Lee, J.-S., & Chung, S. (2010). Optimistic bias about online privacy risks: Testing the moderating effects of perceived controllability and prior experience. Computers in Human Behavior, 26, 987–995. https://doi.org/10.1016/j.chb.2010.02.012

Crowley, J. L. (2017). A framework of relational information control: A review and extension of information control research in interpersonal contexts. Communication Theory, 27(2), 202–222. https://doi.org/10.1111/comt.12115

Derlega, V. J., Metts, S., Petronio, S., & Margulis, S. T. (1993). Self-disclosure. Sage series on close relationships. Newbury Park, CA: Sage Publications.

Dienlin, T. (2014). The privacy process model. In S. Garnett, S. Halft, M. Herz, & J. M. Mönig (Eds.), Medien und Privatheit [Media and privacy] (pp. 105–122). Passau, Germany: Karl Stutz.

Eastin, M. S., Brinson, N. H., Doorey, A., & Wilcox, G. (2016). Living in a big data world: Predicting mobile commerce activity through privacy concerns. Computers in Human Behavior, 58, 214–220. https://doi.org/10.1016/j.chb.2015.12.050

Eichenhofer, J. (2019). e-Privacy - Theorie und Dogmatik eines europäischen Privatheitsschutzes im Internet-Zeitalter [Theoretical and doctrinal foundations of a European privacy protection regulation in the internet age]. Bielefeld, Germany: University of Bielefeld.

Ellison, N. B., & boyd, d. (2013). Sociality through social network sites. In W. H. Dutton (Ed.), The Oxford handbook of Internet studies (pp. 151–172). Oxford, UK: Oxford University Press.

European Commission. (2015). Special Eurobarometer 431: Data protection. Brussels, BE. Retrieved from http://ec.europa.eu/public_opinion/archives/ebs/ebs_431_en.pdf
Evans, S. K., Pearce, K. E., Vitak, J., & Treem, J. W. (2017). Explicating affordances: A conceptual framework for understanding affordances in communication research. Journal of Computer-Mediated Communication, 22(1), 35–52. https://doi.org/10.1111/jcc4.12180

Fox, J., & McEwan, B. (2017). Distinguishing technologies for social interaction: The perceived social affordances of communication channels scale. Communication Monographs, 84(3), 298–318. https://doi.org/10.1080/03637751.2017.1332418

Gibson, J. J. (2014). The ecological approach to visual perception. Psychology Press & Routledge Classic Editions. Hoboken, NJ: Taylor and Francis. (Original work published 1979)

Grimm, R., & Bräunlich, K. (2015). Vertrauen und Privatheit [Trust and privacy]. DuD Datenschutz und Datensicherheit [Data protection and data security], 5, 289–294.

Gusy, C. (2018). Datenschutz als Privatheitsschutz oder Datenschutz statt Privatheitsschutz? [Data protection as privacy protection or data protection instead of privacy protection?]. Europäische Grundrechte Zeitschrift [European Fundamental Rights Journal], 45(9-12), 244–255.

Helm, P., & Eichenhofer, C. (2019). Reflexionen zu einem social turn in den privacy studies [Reflections on a social turn in privacy studies]. In C. Aldenhoff, L. Edeler, M. Hennig, J. Kelsch, L. Raabe, & F. Sobala (Eds.), Digitalität und Privatheit [Digitality and privacy] (pp. 139–166). Bielefeld, Germany: transcript. https://doi.org/10.14361/9783839446614-009

Helsper, E. J. (2017). The social relativity of digital exclusion: Applying relative deprivation theory to digital inequalities. Communication Theory, 27(3), 223–242. https://doi.org/10.1111/comt.12110

Howard, P. N., & Parks, M. R. (2012). Social media and political change: Capacity, constraint, and consequence. Journal of Communication, 62(2), 359–362. https://doi.org/10.1111/j.1460-2466.2012.01626.x
Johnson, C. A. (1974). Privacy as personal control. In S. T. Margulis (Ed.), Man-environment interactions: Evaluations and applications (pp. 83–100). Stroudsburg, PA: Dowden, Hutchinson & Ross.

Joinson, A. N. (2001). Self-disclosure in computer-mediated communication: The role of self-awareness and visual anonymity. European Journal of Social Psychology, 31(2), 177–192. https://doi.org/10.1002/ejsp.36

Laufer, R. S., & Wolfe, M. (1977). Privacy as a concept and a social issue: A multidimensional developmental theory. Journal of Social Issues, 33(3), 22–42. https://doi.org/10.1111/j.1540-4560.1977.tb01880.x

Li, Y. (2011). Empirical studies on online information privacy concerns: Literature review and an integrative framework. Communications of the Association for Information Systems, 28(1), 453–496. Retrieved from http://aisel.aisnet.org/cais/vol28/iss1/28

Madden, M. (2014). Public perceptions of privacy and security in the post-Snowden era. Retrieved from http://www.pewinternet.org/2014/11/12/public-privacy-perceptions/

Madden, M., & Rainie, L. (2015). Americans’ attitudes about privacy, security and surveillance. Retrieved from http://www.pewinternet.org/2015/05/20/americans-attitudes-about-privacy-security-and-surveillance/

Marwick, A. E., & boyd, d. (2014). Networked privacy: How teenagers negotiate context in social media. New Media & Society, 16(7), 1051–1067. https://doi.org/10.1177/1461444814543995

Masur, P. K. (2019). Situational privacy and self-disclosure: Communication processes in online environments. Cham, Switzerland: Springer International Publishing.

Metzger, M. J. (2004). Privacy, trust, and disclosure: Exploring barriers to electronic commerce. Journal of Computer-Mediated Communication, 9(4). https://doi.org/10.1111/j.1083-6101.2004.tb00292.x
Moor, J. H. (1997). Towards a theory of privacy in the information age. ACM SIGCAS Computers and Society, 27(3), 27–32. https://doi.org/10.1145/270858.270866

Moore, B. (1984). Privacy: Studies in social and cultural history. Armonk, NY: M.E. Sharpe.

Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Palo Alto, CA: Stanford University Press.

Ochs, C., & Büttner, B. (2018). Das Internet als "Sauerstoff" und "Bedrohung" [The internet as oxygen and menace]. In M. Friedewald (Ed.), DuD-Fachbeiträge. Privatheit und selbstbestimmtes Leben in der digitalen Welt [Privacy and a self-determined life in a digital world] (pp. 33–80). Wiesbaden, Germany: Springer Vieweg.

Papacharissi, Z. (2010). A private sphere: Democracy in a digital age. Cambridge, UK: Polity Press.

Pedersen, D. M. (1999). Model for types of privacy by privacy functions. Journal of Environmental Psychology, 19, 397–405. https://doi.org/10.1006/jevp.1999.0140

Petronio, S. (2002). Boundaries of privacy. Albany, NY: State University of New York Press.

Qian, H., & Scott, C. R. (2007). Anonymity and self-disclosure on weblogs. Journal of Computer-Mediated Communication, 12(4), 1428–1451. https://doi.org/10.1111/j.1083-6101.2007.00380.x

Quinn, K. (2014). An ecological approach to privacy: “Doing” online privacy at midlife. Journal of Broadcasting & Electronic Media, 58(4), 562–580. https://doi.org/10.1080/08838151.2014.966357

Quinn, K. (2016). Why we share: A uses and gratifications approach to privacy regulation in social media use. Journal of Broadcasting & Electronic Media, 60(1), 61–86. https://doi.org/10.1080/08838151.2015.1127245
Rainie, L., Kiesler, S., Kang, R., & Madden, M. (2013). Anonymity, privacy, and security online. Retrieved from http://www.pewinternet.org/2013/09/05/anonymity-privacy-and-security-online/

Ramirez, A., Bryant, S., Erin, M., Fleuriet, C., & Cole, M. (2015). When online dating partners meet offline: The effect of modality switching on relational communication between online daters. Journal of Computer-Mediated Communication, 20(1), 99–114. https://doi.org/10.1111/jcc4.12101

Saeri, A. K., Ogilvie, C., La Macchia, S. T., Smith, J. R., & Louis, W. R. (2014). Predicting Facebook users' online privacy protection: Risk, trust, norm focus theory, and the theory of planned behavior. The Journal of Social Psychology, 154(4), 352–369. https://doi.org/10.1080/00224545.2014.914881

Sarikakis, K., & Winter, L. (2017). Social media users' legal consciousness about privacy. Social Media + Society, 3(1), 1–14. https://doi.org/10.1177/2056305117695325

Schäwel, J. (2020). How to raise users' awareness of online privacy. Duisburg, Germany: University of Duisburg-Essen.

Schmidt, J.-H. (2014). Twitter and the rise of personal publics. In K. Weller, A. Bruns, J. Burgess, M. Mahrt, & C. Puschmann (Eds.), Digital formations: Vol. 89. Twitter and society (pp. 3–14). New York, NY: Peter Lang.

Seubert, S., & Becker, C. (2019). The culture industry revisited: Sociophilosophical reflections on ‘privacy’ in the digital age. Philosophy & Social Criticism, 45(8), 930–947. https://doi.org/10.1177/0191453719849719

Sevignani, S. (2016). Privacy and capitalism in the age of social media. Routledge research in information technology and society: Vol. 18. New York, NY: Routledge.

Slater, M. D. (2007). Reinforcing spirals: The mutual influence of media selectivity and media effects and their impact on individual behavior and social identity. Communication Theory, 17(3), 281–303. https://doi.org/10.1111/j.1468-2885.2007.00296.x
Smith, H. J., Dinev, T., & Xu, H. (2011). Information privacy research: An interdisciplinary review. MIS Quarterly, 35(4), 989–1016.

Spottswood, E. L., & Hancock, J. T. (2017). Should I share that? Prompting social norms that influence privacy behaviors on a social networking site. Journal of Computer-Mediated Communication, 22(2), 26. https://doi.org/10.1111/jcc4.12182

Taneja, A., Vitrano, J., & Gengo, N. J. (2014). Rationality-based beliefs affecting individual’s attitude and intention to use privacy controls on Facebook: An empirical investigation. Computers in Human Behavior, 38, 159–173. https://doi.org/10.1016/j.chb.2014.05.027

Tavani, H. T. (2007). Philosophical theories of privacy: Implications for an adequate online privacy policy. Metaphilosophy, 38(1), 1–22. https://doi.org/10.1111/j.1467-9973.2006.00474.x

Tavani, H. T., & Moor, J. H. (2001). Privacy protection, control of information, and privacy-enhancing technologies. ACM SIGCAS Computers and Society, 31(1), 6–11. https://doi.org/10.1145/572277.572278

Teutsch, D., Masur, P. K., & Trepte, S. (2018). Privacy in mediated and nonmediated interpersonal communication: How subjective concepts and situational perceptions influence behaviors. Social Media + Society, 4(2), 1–14. https://doi.org/10.1177/2056305118767134

Treem, J. W., & Leonardi, P. M. (2012). Social media use in organizations: Exploring the affordances of visibility, editability, persistence, and association. Communication Yearbook, 36, 143–189. https://doi.org/10.1080/23808985.2013.11679130

Trepte, S., & Masur, P. K. (2017). Need for privacy. In V. Zeigler-Hill & T. K. Shackelford (Eds.), Encyclopedia of personality and individual differences. London, UK: Springer. https://doi.org/10.1007/978-3-319-28099-8_540-1

Trepte, S., & Reinecke, L. (Eds.). (2011). Privacy online: Perspectives on privacy and self-disclosure in the social web. Berlin, Germany: Springer.
Trepte, S., Reinecke, L., Ellison, N. B., Quiring, O., Yao, M. Z., & Ziegele, M. (2017). A cross-cultural perspective on the privacy calculus. Social Media + Society, 3(1), 1–13. https://doi.org/10.1177/2056305116688035

Tsay-Vogel, M., Shanahan, J., & Signorielli, N. (2018). Social media cultivating perceptions of privacy: A 5-year analysis of privacy attitudes and self-disclosure behaviors among Facebook users. New Media & Society, 20(1), 141–161. https://doi.org/10.1177/1461444816660731

Utz, S., & Krämer, N. (2009). The privacy paradox on social network sites revisited: The role of individual characteristics and group norms. Journal of Psychosocial Research on Cyberspace, 3(2). Retrieved from http://cyberpsychology.eu/view.php?cisloclanku=2009111001&article=2

Vitak, J. (2012). The impact of context collapse and privacy on social network site disclosures. Journal of Broadcasting & Electronic Media, 56(4), 451–470. https://doi.org/10.1080/08838151.2012.732140

Waldman, A. E. (2018). Privacy as trust. Cambridge, UK: Cambridge University Press. https://doi.org/10.1017/9781316888667

Walther, J. B. (1996). Computer-mediated communication: Impersonal, interpersonal, and hyperpersonal interaction. Communication Research, 23(1), 3–43. https://doi.org/10.1177/009365096023001001

Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220.

Westin, A. F. (1967). Privacy and freedom. New York, NY: Atheneum.

Wolfe, M., & Laufer, R. (1974). The concept of privacy in childhood and adolescence. In S. T. Margulis (Ed.), Man-environment interactions: Evaluations and applications (pp. 29–54). Stroudsburg, PA: Dowden, Hutchinson & Ross.
Non Parametric Tests: Hands on SPSS
N. Uttam Singh, Aniruddha Roy & A. K. Tripathi
ICAR Research Complex for NEH Region, Umiam, Meghalaya
uttamba@gmail.com, aniruddhaubkv@gmail.com, aktripathi2020@yahoo.co.in

Chapter 1: Introduction
Which is more powerful (parametric and non-parametric tests)
Parametric Assumptions
Nonparametric Assumptions
Advantages of Nonparametric Tests
Disadvantages of nonparametric tests
Few important points on nonparametric test
Measurement
Parametric vs. non-parametric tests
Nonparametric Methods

Chapter 2: Tests of relationships between variables


Chi-square Test
Binomial Test
Run Test for Randomness
One-Sample Kolmogorov-Smirnov Test

Chapter 3: Two-Independent-Samples Tests


Mann-Whitney U test
The two-sample Kolmogorov-Smirnov test
Wald-Wolfowitz Runs Test
Moses Extreme Reactions
Chapter 4: Multiple Independent Samples Tests
Median test
Kruskal-Wallis H
Jonckheere-Terpstra test

Chapter 5: Tests for Two Related Samples


Wilcoxon signed-ranks
McNemar
Marginal homogeneity
Sign test

Chapter 6: Tests for Multiple Related Samples


Friedman
Cochran’s Q
Kendall’s W

Chapter 7: Exact Tests and Monte Carlo Method


The Exact Method
The Monte Carlo Method
When to Use Exact Tests

Test Questions:

References:
They are called nonparametric because they make no assumptions about the parameters (such as the mean
and variance) of a distribution, nor do they assume that any particular distribution is being used.

Introduction
A parametric statistical test is one that makes assumptions about the parameters (defining properties) of the population
distribution(s) from which one's data are drawn.

A non-parametric test is one that makes no such assumptions. In this strict sense, "non-parametric" is essentially a null
category, since virtually all statistical tests assume one thing or another about the properties of the source population(s).

Which is more powerful?


Non-parametric statistical procedures are less powerful because they use less information in their calculation. For
example, a parametric correlation uses information about the mean and deviation from the mean while a non-parametric
correlation will use only the ordinal position of pairs of scores.

Parametric Assumptions
 The observations must be independent
 The observations must be drawn from normally distributed populations
 These populations must have the same variances
 The means of these normal and homoscedastic populations must be linear combinations of effects due to columns
and/or rows

Nonparametric Assumptions
Certain assumptions are associated with most nonparametric statistical tests, but these are fewer and weaker than
those of parametric tests.

Advantages of Nonparametric Tests


 Probability statements obtained from most nonparametric statistics are exact probabilities, regardless of the
shape of the population distribution from which the random sample was drawn
 If sample sizes as small as N=6 are used, there is no alternative to using a nonparametric test
 Easier to learn and apply than parametric tests
 Based on a model that specifies very general conditions.
 No specific form of the distribution from which the sample was drawn.
 Hence nonparametric tests are also known as distribution free tests.

Disadvantages of nonparametric tests


 Losing precision/wasteful of data
 Low power
 False sense of security
 Lack of software
 Testing distributions only
 Higher-ordered interactions not dealt with
 Parametric models are more efficient if data permit.
 It is difficult to compute by hand for large samples
 Tables are not widely available
 In cases where a parametric test would be appropriate, non-parametric tests have less power. In other words,
a larger sample size can be required to draw conclusions with the same degree of confidence.

Few points
 The inferences drawn from tests based on the parametric tests such as t, F and Chi-square may be seriously
affected when the parent population’s distribution is not normal.
 The adverse effect could be more when sample size is small.
 Thus when there is doubt about the distribution of the parent population, a nonparametric method should be
used.
 In many situations, particularly in social and behavioral sciences, observations are difficult or impossible to
take on numerical scales and a suitable nonparametric test is an alternative under such situations.
Measurement
The 4 levels of measurement
1. Nominal or Classificatory Scale
 Gender, ethnic background, colors of a spectrum
 In research activities a YES/NO scale is nominal. It has no order and there is no distance
between YES and NO.
2. Ordinal or Ranking Scale
 Hardness of rocks, beauty, military ranks
 The simplest ordinal scale is a ranking.
 There is no objective distance between any two points on your subjective scale.
3. Interval Scale
 Celsius or Fahrenheit. It is an interval scale because it is assumed to have equidistant
points between each of the scale elements.
4. Ratio Scale
 Kelvin temperature, speed, height, mass or weight
 Ratio data is interval data with a natural zero point
Parametric vs. non-parametric tests

                                     Parametric                            Non-parametric
Assumed distribution                 Normal                                Any
Assumed variance                     Homogeneous                           Any
Typical data                         Ratio or Interval                     Ordinal or Nominal
Data set relationships               Independent                           Any
Usual central measure                Mean                                  Median
Benefits                             Can draw more conclusions             Simplicity; less affected by outliers

Choosing a test

                                     Parametric test                       Non-parametric test
Correlation                          Pearson                               Spearman
Independent measures, 2 groups       Independent-measures t-test           Mann-Whitney test
Independent measures, >2 groups      One-way, independent-measures ANOVA   Kruskal-Wallis test
Repeated measures, 2 conditions      Matched-pair t-test                   Wilcoxon test
Repeated measures, >2 conditions     One-way, repeated measures ANOVA      Friedman's test
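The correlation row of the table can be illustrated outside SPSS as well. As a sketch with made-up data (not from this chapter), Spearman's correlation uses only the ordinal positions of the scores, so a monotonic but nonlinear relationship yields a perfect Spearman coefficient while Pearson's stays below 1:

```python
from scipy.stats import pearsonr, spearmanr

# Monotonic but nonlinear toy data (illustrative only)
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [v**3 for v in x]

r, _ = pearsonr(x, y)      # parametric: uses the actual distances between scores
rho, _ = spearmanr(x, y)   # non-parametric: uses only the ranks

print(f"Pearson r = {r:.3f}")       # below 1: the relationship is not linear
print(f"Spearman rho = {rho:.1f}")  # exactly 1.0: the ranks agree perfectly
```

This is why the non-parametric column of the table is "less affected by outliers": a single extreme score changes its rank by little, but can change its distance from the mean by a lot.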
Nonparametric Methods

For each common parametric test there is at least one equivalent nonparametric test.

Tests of relationships between variables

Chi-square Test
This goodness-of-fit test compares the observed and expected frequencies in each category to test either that all categories
contain the same proportion of values or that each category contains a user-specified proportion of values.

Examples
The chi-square test could be used to determine if a basket of fruit contains equal proportions of apples, bananas, oranges,
and peaches.
fruits count
orange 1
orange 1
mango 2
banana 3
lemon 4
banana 3
orange 1
lemon 4
lemon 4
orange 1
mango 2
banana 3
lemon 4
banana 3
orange 1
lemon 4
lemon 4
SPSS Steps:
Get the data.

Follow the steps as shown


Get the count in the test variable list

Click OK and get the output as shown below

Interpretation:
Here the p value is 0.981, which is more than 0.05. Hence the result is not significant: we fail to reject the null hypothesis and conclude that there is no significant difference in the proportions of the fruits (oranges, mangoes, bananas, and lemons) in the basket.
We could also test whether the basket contains prespecified proportions of the fruits (e.g., 10%, 20%, 50%, and 20%). For this we have to define the proportions by checking the button “Values” and adding them one by one.
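For readers without SPSS, the same goodness-of-fit test can be sketched in Python with scipy (this is an assumption-laden sketch, not part of the original SPSS workflow; the observed counts below are tallied by hand from the fruit table above, and the p value depends on exactly these frequencies):

```python
from scipy.stats import chisquare

# Observed frequencies tallied from the fruit table above:
# orange = 5, mango = 2, banana = 4, lemon = 6 (n = 17)
observed = [5, 2, 4, 6]

# Null hypothesis: all four fruits occur in equal proportions
stat, p = chisquare(observed)
print(f"chi-square = {stat:.3f}, p = {p:.3f}")

if p > 0.05:
    print("Fail to reject H0: the fruit proportions do not differ significantly.")
```

To mirror the SPSS "Values" option for user-specified proportions, pass expected counts via `f_exp`, e.g. `chisquare(observed, f_exp=[0.1 * 17, 0.2 * 17, 0.5 * 17, 0.2 * 17])`.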

Binomial Test
The Binomial Test procedure is useful when you want to compare a single sample from a dichotomous variable to an
expected proportion. If the dichotomy does not exist in the data as a variable, one can be dynamically created based upon
a cut point on a scale variable (take age as example from the data). If your variable has more than two outcomes, try the
Chi-Square Test procedure. If you want to compare two dichotomous variables, try the McNemar test in the Two-Related-
Samples Tests procedure.
Example
Say we wish to test whether the proportion of females from the variable “gender” differs significantly from 50%, i.e.,
from 0.5. We will use the exact statement to produce the exact p-values.

AgeMarital_Status Family_Size Land_Holding Achievement Market_Orientation Problem Gender

21 2 1 1 83 17 16 0
40 1 0 0 77 18 17 0
32 1 0 1 79 18 17 0
37 1 2 1 80 18 17 1
40 3 2 1 78 18 17 0
40 1 2 0 78 18 17 1
52 1 0 0 79 24 13 0
35 2 2 1 94 24 20 1
38 2 2 1 81 22 12 0
55 1 0 1 78 18 10 1
35 2 1 0 87 23 17 1
35 3 2 1 89 22 10 0
55 1 1 0 87 23 15 0
40 1 2 1 86 23 14 1
62 1 1 1 80 18 10 1
40 1 1 0 83 24 13 1
48 3 1 1 76 21 14 1
62 1 2 1 84 23 11 0
36 1 0 0 81 26 11 0
35 1 2 1 80 21 11 0
35 1 2 1 77 22 13 1
35 1 1 1 82 16 14 1
18 2 2 0 83 26 10 0

SPSS Steps:

Get the
data.
Follow the steps as shown below

Get the variable gender in the test variable list.


Click OK and get the output

Interpretation:
Since the p value is 1, the result is not significant: we fail to reject the null hypothesis and conclude that the proportion of females in the variable “gender” does not differ significantly from 50%.
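The same exact binomial test can be sketched in Python with scipy (a sketch alongside the SPSS steps; the gender column below is copied from the data table above, with 1 assumed to code female):

```python
from scipy.stats import binomtest

# Gender column from the table above (assumed coding: 1 = female, 0 = male)
gender = [0, 0, 0, 1, 0, 1, 0, 1, 0, 1, 1, 0,
          0, 1, 1, 1, 1, 0, 0, 0, 1, 1, 0]

k = sum(gender)                    # number of females
n = len(gender)                    # sample size
result = binomtest(k, n, p=0.5)    # exact two-sided test against 50%
print(f"{k} females out of {n}, p = {result.pvalue:.3f}")
```

Because 11 out of 23 is almost exactly half, the exact two-sided p value is 1, matching the SPSS output.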
Run Test for Randomness
Run test is used for examining whether or not a set of observations constitutes a random sample from an infinite
population. Test for randomness is of major importance because the assumption of randomness underlies statistical
inference. In addition, tests for randomness are important for time series analysis. Departure from randomness can take
many forms. The cut point is based either on a measure of central tendency (mean, median, or mode) or a custom value. A
sample with too many or too few runs suggests that the sample is not random.

Example
Let’s see whether the variable “AGE” in the dataset below is random.
Table: Cancer dataset
ID TRT AGE WEIGHIN STAGE TOTALCIN TOTALCW2 TOTALCW4 TOTALCW6
1 0 52 124 2 6 6 6 7
5 0 77 160 1 9 6 10 9
6 0 60 136.5 4 7 9 17 19
9 0 61 179.6 1 6 7 9 3
11 0 59 175.8 2 6 7 16 13
15 0 69 167.6 1 6 6 6 11
21 0 67 186 1 6 11 11 10
26 0 56 158 3 6 11 15 15
31 0 61 212.8 1 6 9 6 8
35 0 51 189 1 6 4 8 7
39 0 46 149 4 7 8 11 11
41 0 65 157 1 6 6 9 6
45 0 67 186 1 8 8 9 10
2 0 46 163.8 2 7 16 9 10
12 1 56 227.2 4 6 10 11 9
14 1 42 162.6 1 4 6 8 7
16 1 44 261.4 2 6 11 11 14
22 1 27 225.4 1 6 7 6 6
24 1 68 226 4 12 11 12 9
34 1 77 164 2 5 7 13 12
37 1 86 140 1 6 7 7 7
42 1 73 181.5 0 8 11 16
44 1 67 187 1 5 7 7 7
50 1 60 164 2 6 8 16
58 1 54 172.8 4 7 8 10 8
SPSS Steps:

Load the data.

Follow the following steps.


Select “AGE” in the test variables list.

The variable “AGE” must be divided into two separate groups, so we must indicate a cut point; here we take the median. Any value below the median belongs to one group, and any value greater than or equal to the median belongs to the other group. Now click OK to get the output.

Interpretation:
The p value is 0.450, which is not significant, so we cannot conclude that AGE is non-random.
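SciPy has no built-in runs test, so here is a minimal sketch of the median-cut runs test using the continuity-corrected normal approximation (the small-sample variant SPSS reports), applied to the AGE column of the cancer dataset above:

```python
from math import sqrt
from statistics import median
from scipy.stats import norm

age = [52, 77, 60, 61, 59, 69, 67, 56, 61, 51, 46, 65, 67,
       46, 56, 42, 44, 27, 68, 77, 86, 73, 67, 60, 54]

cut = median(age)                            # cut point (here 60)
signs = [x >= cut for x in age]              # True: >= median, False: < median
runs = 1 + sum(a != b for a, b in zip(signs, signs[1:]))
n1, n2 = signs.count(True), signs.count(False)
n = n1 + n2

mu = 2 * n1 * n2 / n + 1                     # expected number of runs under H0
sigma = sqrt(2 * n1 * n2 * (2 * n1 * n2 - n) / (n**2 * (n - 1)))
z = (abs(runs - mu) - 0.5) / sigma           # continuity-corrected z statistic
p = 2 * norm.sf(z)                           # two-sided p value
print(f"runs = {runs}, z = {z:.3f}, p = {p:.3f}")  # p = 0.450, as in the SPSS output
```

Too few runs would suggest clustering; too many would suggest systematic alternation. Here 11 runs is close to the 13.3 expected under randomness.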

One-Sample Kolmogorov-Smirnov Test


The One-Sample Kolmogorov-Smirnov procedure is used to test the null hypothesis that a sample comes from a particular distribution. Four theoretical distribution functions are available: normal, uniform, Poisson, and exponential. If you want to compare the distributions of two variables, use the two-sample Kolmogorov-Smirnov test in the Two-Independent-Samples Tests procedure.

Example: Let us test whether the variable “AGE” in the cancer dataset used for the runs test above follows a normal or a uniform distribution.
SPSS Steps
Get the data as done before. Then…

Select “AGE” in the test variable list.

Check the distribution for which you want to test. Click OK and get the output.
Interpretation:
The p value is 0.997, which is not significant; therefore we cannot say that “AGE” departs from an approximately normal distribution. If the p value were less than 0.05, the result would be significant and we would conclude that AGE does not follow an approximately normal distribution.
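A comparable check can be sketched in Python with scipy's one-sample K-S test (a sketch only: like the SPSS dialog, it estimates the normal parameters from the same data, which makes the p value approximate; the Lilliefors variant corrects for this):

```python
import statistics
from scipy.stats import kstest

age = [52, 77, 60, 61, 59, 69, 67, 56, 61, 51, 46, 65, 67,
       46, 56, 42, 44, 27, 68, 77, 86, 73, 67, 60, 54]

# Compare AGE against a normal distribution with the sample mean and SD
mu = statistics.mean(age)
sd = statistics.stdev(age)
stat, p = kstest(age, "norm", args=(mu, sd))
print(f"D = {stat:.3f}, p = {p:.3f}")
```

A large p value here, as in the SPSS output, gives no grounds to reject approximate normality of AGE.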

Two-Independent-Samples Tests
The nonparametric tests for two independent samples are useful for determining whether or not the values of a particular
variable differ between two groups. This is especially true when the assumptions of the t test are not met.
 Mann-Whitney U test: To test for differences between two groups
 The two-sample Kolmogorov-Smirnov test: To test the null hypothesis that two samples have the same distribution
 Wald-Wolfowitz runs test: Used to examine whether two random samples come from populations having the same distribution
 Moses extreme reactions: Used to examine whether extreme values (the span of scores) differ between the two groups

Example: We want to find out whether the sales are different between two designs.

sales design store_size


11 1 1
17 1 2
16 1 3
14 1 4
15 1 5
12 2 1
10 2 2
15 2 3
19 2 4
11 2 5
23 3 1
20 3 2
18 3 3
17 3 4
27 4 1
33 4 2
22 4 3
26 4 4
28 4 5
SPSS Steps:
Open the dataset

Let’s compare designs 1 and 2.

Enter the variable sales in the test variable list and design in the grouping variable.
Since we are performing a two-independent-samples test, we have to designate which two groups of our factor design we want to compare. So click “Define groups”.

Here we enter groups 2 and 1; the order is not important as long as we enter two distinct groups. Then click Continue and OK to get the output.

Interpretation:
Two p values are displayed: the asymptotic p value, which is appropriate for large samples, and the exact p value, which is independent of sample size. We therefore take the exact p value, i.e., 0.548, which is not significant, and we conclude that there is no significant difference in sales between design group 1 and design group 2.
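The Mann-Whitney comparison of designs 1 and 2 can be sketched in Python with scipy (a sketch, not the SPSS procedure itself; because the samples contain tied values, scipy falls back on the asymptotic p value, which may differ slightly from SPSS's exact p of 0.548 but leads to the same conclusion):

```python
from scipy.stats import mannwhitneyu

# Sales for design 1 and design 2, from the table above
design1 = [11, 17, 16, 14, 15]
design2 = [12, 10, 15, 19, 11]

res = mannwhitneyu(design1, design2, alternative="two-sided")
print(f"U = {res.statistic}, p = {res.pvalue:.3f}")

if res.pvalue > 0.05:
    print("No significant difference in sales between designs 1 and 2.")
```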
Multiple Independent Samples Tests
The nonparametric tests for multiple independent samples are useful for determining whether or not the values of a
particular variable differ between two or more groups. This is especially true when the assumptions of ANOVA are not
met.
 Median test: This method tests the null hypothesis that two or more independent samples have the same
median. It assumes nothing about the distribution of the test variable, making it a good choice when you
suspect that the distribution varies by group
 Kruskal-Wallis H: This test is a one-way analysis of variance by ranks. It tests the null hypothesis that
multiple independent samples come from the same population.
 Jonckheere-Terpstra test: Tests the null hypothesis of no difference against an ordered alternative, i.e., that the responses increase (or decrease) across the ordered groups

Example:
We want to find out whether the sales are different between the designs (Comparing more than two samples
simultaneously)

SPSS Steps:
Get the data in SPSS window as done before. Then…
Define range

Click continue then OK to get output.

Interpretation:
The p value is 0.003, which is significant. Therefore we conclude that there is a significant difference between the groups (meaning that at least two groups differ).
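The same Kruskal-Wallis comparison across all four designs can be sketched in Python with scipy (a sketch alongside the SPSS steps; the four lists are copied from the sales table above, where design 3 has only four stores):

```python
from scipy.stats import kruskal

# Sales by design, from the table above
design1 = [11, 17, 16, 14, 15]
design2 = [12, 10, 15, 19, 11]
design3 = [23, 20, 18, 17]
design4 = [27, 33, 22, 26, 28]

stat, p = kruskal(design1, design2, design3, design4)
print(f"H = {stat:.3f}, p = {p:.4f}")

if p < 0.05:
    print("At least two design groups differ in sales.")
```

As in the SPSS output, the p value is well below 0.05, driven largely by the high sales of design 4.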

Tests for Two Related Samples


The nonparametric tests for two related samples allow you to test for differences between paired scores when you cannot
(or would rather not) make the assumptions required by the paired-samples t test. Procedures are available for testing
nominal, ordinal, or scale variables.
 Wilcoxon signed-ranks: A nonparametric alternative to the paired-samples t test. The only assumptions
made by the Wilcoxon test are that the test variable is continuous and that the distribution of the difference
scores is reasonably symmetric.
 McNemar method tests the null hypothesis that binary responses are unchanged. As with the Wilcoxon test,
the data may be from a single sample measured twice or from two matched samples. The McNemar test is
particularly appropriate with nominal or ordinal test variables for binary data. Unlike the Wilcoxon test, the
McNemar test is designed for use with nominal or ordinal test variables.
 Marginal-homogeinity: If the varialbles are mortinomial i.e if they have more than two levels.
 Sign test: Wilkoxon and Sign are used for contineous data and of the two wilkoxon is more powerful
Example: Use the cancer data employed in the Runs Test to test whether the condition of the cancer patients at the end of the 2nd week and the 4th week differs significantly. (Here, the higher the reading, the better the condition.)
Output:

Interpretation:
The p value is 0.006, which is significant. This indicates that the condition of the cancer patients at the end of the 2nd week and the 4th week differs.

Tests for Multiple Related Samples


The nonparametric tests for multiple related samples are useful alternatives to a repeated measures analysis of variance.
They are especially appropriate for small samples and can be used with nominal or ordinal test variables.
Friedman test is a nonparametric alternative to the repeated measures ANOVA. It tests the null hypothesis that
multiple ordinal responses come from the same population. As with the Wilcoxon test for two related samples, the
data may come from repeated measures of a single sample or from the same measure from multiple matched samples.
The only assumptions made by the Friedman test are that the test variables are at least ordinal and that their
distributions are reasonably similar.
Cochran’s Q: It tests the null hypothesis that multiple related proportions are the same. Think of the Cochran Q test
as an extension of the McNemar test used to assess change over two times or two matched samples. Unlike the
Friedman test, the Cochran test is designed for use with binary variables.
Kendall’s W: This is a normalization of the Friedman statistic and can be interpreted as a measure of agreement among raters, ranging from 0 (no agreement) to 1 (complete agreement).
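The link between the Friedman statistic and Kendall's W can be illustrated with SciPy. The repeated-measures scores below are invented; W is obtained from the Friedman chi-square via W = chi2 / (n(k-1)).

```python
# Illustrative sketch: Friedman test on made-up repeated measures
# (n = 8 subjects scored under k = 3 conditions), plus Kendall's W.
from scipy import stats

# Each position i = one subject's score under each condition
cond1 = [7, 6, 8, 7, 6, 8, 7, 8]
cond2 = [5, 4, 6, 5, 5, 6, 4, 5]
cond3 = [8, 7, 9, 8, 7, 9, 8, 9]

chi2, p = stats.friedmanchisquare(cond1, cond2, cond3)

# Kendall's W: the Friedman chi-square normalized by n(k - 1)
n, k = len(cond1), 3
W = chi2 / (n * (k - 1))

print(f"Friedman chi2 = {chi2:.3f}, p = {p:.5f}, Kendall's W = {W:.3f}")
```

In this toy data every subject ranks the conditions in the same order, so W comes out at its maximum of 1, and the Friedman test is strongly significant.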
SPSS steps:

Output
Interpretation:
The p value is less than 0.05. Hence there is a significant difference between the four groups (i.e., at least two groups differ).
Exact Tests and Monte Carlo Method
These new methods, the exact and Monte Carlo methods, provide a powerful means for obtaining accurate results when
your data set is small, your tables are sparse or unbalanced, the data are not normally distributed, or the data fail to meet
any of the underlying assumptions necessary for reliable results using the standard asymptotic method.

The Exact Method


By default, IBM® SPSS® Statistics calculates significance levels for the statistics in the Crosstabs and Nonparametric
Tests procedures using the asymptotic method. This means that p values are estimated based on the assumption that the
data, given a sufficiently large sample size, conform to a particular distribution.

However, when the data set is small, sparse, contains many ties, is unbalanced, or is poorly distributed, the asymptotic
method may fail to produce reliable results. In these situations, it is preferable to calculate a significance level based on
the exact distribution of the test statistic. This enables you to obtain an accurate p value without relying on assumptions
that may not be met by your data.
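A small sketch of this asymptotic-versus-exact contrast, using SciPy on an invented sparse 2x2 table: the asymptotic chi-square p value and the exact (Fisher) p value can differ noticeably when counts are small.

```python
# Illustrative sketch: asymptotic vs. exact p values on a small,
# sparse 2x2 table (counts are made up for illustration).
from scipy import stats

table = [[8, 2],
         [1, 5]]

# Asymptotic method: chi-square test of independence
chi2, p_asymp, dof, expected = stats.chi2_contingency(table)

# Exact method: Fisher's exact test enumerates the exact distribution
odds_ratio, p_exact = stats.fisher_exact(table)

print(f"Asymptotic chi-square p = {p_asymp:.4f}")
print(f"Exact (Fisher) p       = {p_exact:.4f}")
```

With a table this small, the exact p value is the one to trust; the asymptotic approximation is unreliable because several expected cell counts are below 5.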

The Monte Carlo Method


Although exact results are always reliable, some data sets are too large for the exact p value to be calculated, yet don’t
meet the assumptions necessary for the asymptotic method. In this situation, the Monte Carlo method provides an
unbiased estimate of the exact p value, without the requirements of the asymptotic method.
The Monte Carlo method is a repeated sampling method. For any observed table, there are many tables, each with the
same dimensions and column and row margins as the observed table. The Monte Carlo method repeatedly samples a
specified number of these possible tables in order to obtain an unbiased estimate of the true p value.

The Monte Carlo method is less computationally intensive than the exact method, so results can often be obtained more
quickly. However, if you have chosen the Monte Carlo method, but exact results can be calculated quickly for your data,
they will be provided.
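The resampling idea behind the Monte Carlo method can be sketched in plain Python: rather than enumerating every possible table, we repeatedly shuffle the group labels and count how often the permuted statistic is at least as extreme as the observed one. The data and the difference-of-means statistic are invented for illustration.

```python
# Illustrative sketch of the Monte Carlo (repeated sampling) idea.
import random

random.seed(42)  # fixed seed so the estimate is reproducible

group_a = [12, 15, 14, 11, 13]
group_b = [22, 25, 20, 23, 21]

def mean_diff(a, b):
    """Absolute difference of group means, used as the test statistic."""
    return abs(sum(a) / len(a) - sum(b) / len(b))

observed = mean_diff(group_a, group_b)
pooled = group_a + group_b
n_a = len(group_a)

# Repeatedly reshuffle labels under the null and recompute the statistic
n_samples = 10_000
extreme = 0
for _ in range(n_samples):
    random.shuffle(pooled)
    if mean_diff(pooled[:n_a], pooled[n_a:]) >= observed:
        extreme += 1

# Monte Carlo estimate of the permutation p value
p_mc = extreme / n_samples
print(f"Monte Carlo p = {p_mc:.4f}")
```

Because only a sample of the possible relabelings is drawn, the estimate carries sampling error; SPSS reports a confidence interval around its Monte Carlo p value for the same reason.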

When to Use Exact Tests


Calculating exact results can be computationally intensive, time-consuming, and can sometimes exceed the memory limits
of your machine. In general, exact tests can be performed quickly with sample sizes of less than 30. Table 1.1 provides a
guideline for the conditions under which exact results can be obtained quickly.
[Table 1.1, giving sample-size guidelines for when exact results can be obtained quickly, is not reproduced here.]
References:

Varghese, E., & Varghese, C. Nonparametric Tests. Indian Agricultural Statistics Research Institute, New Delhi - 110 012. eldho@iasri.res.in, cini_v@iasri.res.in

Mehta, C. R., & Patel, N. R. IBM SPSS Exact Tests.

IBM SPSS Statistics Base 20.
