
RESEARCH ARTICLE

PRIVACY CONCERNS AND DATA SHARING IN THE INTERNET OF THINGS: MIXED METHODS EVIDENCE FROM CONNECTED CARS¹
Patrick Cichy
Bern University of Applied Sciences, Institute for Applied Data Science and Finance,
Bern, SWITZERLAND {patrick.cichy@bfh.ch}

Torsten Oliver Salge
RWTH Aachen University, TIME Research Area, Institute for Technology and Innovation Management,
Aachen, GERMANY {salge@time.rwth-aachen.de}

Rajiv Kohli
Raymond A. Mason School of Business, William & Mary,
Williamsburg, VA, U.S.A. {rajiv.kohli@mason.wm.edu}

The Internet of Things (IoT) is increasingly transforming the way we work, live, and travel. IoT devices collect,
store, analyze, and act upon a continuous stream of data as a by-product of everyday use. However, IoT devices
need unrestricted data access to fully function. As such, they invade users’ virtual and physical space and raise
far-reaching privacy challenges that are unlike those examined in other contexts. As advanced IoT devices,
connected cars offer a unique setting to review and extend established theory and evidence on privacy and data
sharing. Employing a sequential mixed methods design, we conducted an interview study (n=120), a survey
study (n=333), and a field experiment (n=324) among car drivers to develop and validate a contextualized
model of individuals’ data sharing decisions. Our findings from the three studies highlight the interplay between
virtual and physical risks in shaping drivers’ privacy concerns and data sharing decisions—with information
privacy and data security emerging as discrete yet closely interrelated concepts. Our findings also highlight the
importance of psychological ownership, conceptualized as drivers’ feelings of possession toward their driving
data, as an important addition to established privacy calculus models of data sharing. This novel perspective
explains why individuals are reluctant to share even low-sensitivity data that do not raise privacy concerns. The
psychological ownership perspective has implications for designing incentives for data-enabled services in ways
that augment drivers’ self-efficacy and psychological ownership and thereby encourage them to share driving
data. These insights help reconcile a fundamental tension among IoT users—how to avail themselves of the benefits of data-enabled IoT devices while reducing the psychological costs associated with the sharing of personal data.
Keywords: Privacy, privacy concerns, cybersecurity, psychological ownership, data sharing, data
disclosure, Internet of Things, IoT, connected car, smart products

¹ Susan Brown was the accepting senior editor for this paper. Saonee Sarker served as the associate editor.

DOI: 10.25300/MISQ/2021/14165. MIS Quarterly Vol. 45 No. 4, pp. 1863-1892, December 2021.

Introduction

“[Smart] doorbells and … cams … seemed like a good idea, until hackers, stalkers and police tried to get their hands on the video feed. … Applied to a car, the questions multiply. Can you just peer in on your teen driver—or spouse? Do I have to share my footage with the authorities? Should my car be allowed to kick me off the road if it thinks I’m sleepy? How long until insurance companies offer ‘discounts’ for direct video access?” (G. A. Fowler in The Washington Post, February 27, 2020, emphasis added).

The Internet of Things (IoT) is increasingly pervading and transforming the way we work, live, and travel. Consumer IoT is reshaping fields as diverse as fitness, housekeeping, home automation and security, and personal mobility. IoT devices such as fitness trackers, robot vacuum cleaners, smart meters, smart cameras, and connected cars have become constant companions that promise automation, personalization, and comfort for everyday activities (Beverungen et al. 2019; Raff et al. 2020).

This value proposition, however, relies on IoT devices collecting, storing, analyzing, and acting upon a continuous stream of data generated by increasingly manifold and powerful sensors. As a result, data streams are collected as an inevitable by-product of IoT devices’ everyday usage—even though not all users are fully aware of the actual scale and scope of such practices. In fact, IoT devices can capture important information about the device itself, its user, proximal or distant others, as well as the local context. When analyzed jointly, IoT data can reveal individuals’ behavior, preferences, and personality with potentially negative consequences for them. Data from connected cars, for instance, can provide insights into drivers’ attention, risk-taking behavior, and mobility patterns—all of which can be of considerable interest for actors in the broader connected car ecosystem including manufacturers, insurers, and regulators. Hence, data privacy and security concerns are likely to impact users’ IoT adoption and data sharing decisions. As such, IoT devices pose a novel, almost paradoxical challenge to users: The core value proposition of IoT devices is inexorably linked to the unrestricted sharing of user data. Yet sharing of that data in the absence of adequate privacy controls exposes the user to substantial virtual and physical risks. Extant privacy studies conducted in contexts as diverse as online shopping (e.g., Dinev and Hart 2006), location-based services (e.g., Crossler and Bélanger 2019), electronic health records (e.g., Anderson and Agarwal 2011), and online social networks (e.g., Krasnova et al. 2010) consistently point to the importance of privacy concerns in shaping individual data sharing decisions and highlight the role of data sensitivity, trusting beliefs, and personalization benefits. However, none of these previous studies have explored privacy concerns and data sharing in the emerging IoT context with its unique privacy challenges.

But is the emerging IoT context sufficiently distinct to raise new privacy challenges, limit the applicability of established theory and evidence, and warrant dedicated research attention? Four key properties of IoT set it apart from the previously examined contexts. IoT devices (1) tend to be “always on” and generate continuous data streams, (2) give users little or no power to control the data flows, (3) require unrestricted data access to fully function, and (4) invade users’ virtual and physical space given the increasingly powerful actuators—components that transform electric impulses into physical actions—they are equipped with. These four properties alter the very nature of both the privacy risks that IoT users are exposed to and those users’ decision-making with regard to data sharing.

In response, we adopt a privacy perspective relating to data sharing to explore the rapidly expanding context of consumer IoT in general and connected cars in particular. Employing a sequential mixed methods design (Venkatesh et al. 2013; 2016), we dive deep into car drivers’ privacy assessments and their data sharing decisions. We seek to (1) uncover the key determinants of individuals’ data sharing decisions with a focus on privacy concerns and psychological ownership, that is, feelings of ownership toward their driving data; (2) explicate the interplay of those determinants in the presence and absence of data-enabled services as incentives; and (3) explore how to incentivize IoT users to share data for service innovation.

The connected car is an ideal IoT exemplar for our purposes, as it exhibits high levels of all four distinguishing properties of IoT devices. Car manufacturers and other players in the growing connected car ecosystem (e.g., insurers, car rental companies, car repair shops, and providers of navigation and infotainment services) have recognized car data as an essential source of future value creation and service innovation (McKinsey 2016)—and are jointly transforming the connected car into a “rolling self-surveillance machine” (The Washington Post, 2020). The connected car is not only an important technological stepping stone toward autonomous driving, but also a window into a possible future of consumer IoT more broadly. It provides a unique opportunity to engage in contextualized theorizing (Hong et al. 2014) and to review and extend established theory and evidence on privacy and data sharing in light of the emerging realities of the IoT. After all, in the history of privacy research, technological advances have played a vital role in pushing the boundaries of and stimulating theory development (Smith et al. 2011).

Given the contextual novelty of our research, we adopted a sequential mixed methods design to guide the interplay of our studies (Venkatesh et al. 2013). Below, we describe three consecutive studies—arranged in a way so that the findings of one study can inform the next—which allowed us to develop and validate a model of individuals’ data sharing decisions. First, short, semistructured interviews (Study 1) provided rich insights into drivers’ data sharing decisions and helped identify salient constructs. Second, our survey (Study 2) formally tested and supported drivers’ privacy concerns as mediating the relationship between psychological ownership of their driving data and their data sharing intentions. Third, our field study (Study 3) employed a plug-in telematics device and associated app to turn ordinary cars into connected cars in order to move from drivers’ intentions to their actual data disclosure decisions and associated contingency factors (relational trust, security beliefs, data sensitivity, and incentive design).


Conceptual Background

Privacy and Individuals’ Data Sharing Decisions

Privacy is commonly defined as the ability to control information about oneself (Stone et al. 1983).² Empirical studies seeking to understand individuals’ decisions about the maintenance or loss of this ability have traditionally relied on the privacy calculus as the dominant analytical framework (e.g., Jiang et al. 2013; Xu et al. 2009). This perspective considers opposing forces that jointly influence individuals in their decisions regarding whether to reveal or to conceal personal information (Culnan 1993). Individuals are assumed to weigh the positive and negative consequences of any particular disclosure of information—i.e., to perform a cost-benefit analysis (Culnan 1993; Dinev and Hart 2006) and to act in a way that will result in the most favorable net outcome (Stone et al. 1983). This rationale is in line with theories of human decision-making, such as utility maximization theory (Awad and Krishnan 2006), expectancy theory of motivation (Stone and Stone 1990) and expectancy-value theory (Ajzen 1991), which serve as a psychological foundation of the privacy calculus (Li 2012).

² We use the term “privacy” as a reference to “information privacy” throughout the paper.

While the perceived benefits of information sharing might take the form of financial rewards (Hui et al. 2007) or personalization (Awad and Krishnan 2006), the perceived costs are assumed to originate first and foremost from the endangered privacy of individuals (i.e., the partial loss of the ability to control personal information), manifested in their privacy concerns (Malhotra et al. 2004; Smith et al. 1996). Privacy concerns, viewed as a measurable proxy for privacy itself, constitute the central construct in empirical studies on privacy-related behavior. They have been identified as the most influential determinant of individuals’ willingness to share personal information (Malhotra et al. 2004; Stewart and Segars 2002). In line with the privacy calculus perspective on sharing decisions, Dinev and Hart (2006) observed that in most cases, individuals do not strive to retain absolute privacy, but are willing to partially sacrifice it for certain benefits that arise from data sharing.

Privacy is said to be a highly context-sensitive concept (Altman 1975; Margulis 2003; Westin 1967), meaning that the perceived costs and benefits individuals have to trade off against each other can differ significantly across situations. With this in mind, scholars have examined how privacy-related behavior interacts with contextual factors, such as what type of information is being requested, who is requesting it, and what it is to be used for (e.g., Anderson and Agarwal 2010; Malhotra et al. 2004). With the rise of e-commerce, for example, consumers have been asked to share their names and addresses with online vendors (Dinev and Hart 2006). In addition, they have had to come to terms with the fact that their search and buying patterns are tracked by online vendors who increasingly personalize websites to their customers’ individual product interests (Xiao and Benbasat 2007). Similarly, the digitization of medical data promised to enhance the quality and efficiency of care (Agarwal et al. 2010) yet continues to rely on patients’ willingness to share their personal health data with healthcare providers, pharmaceutical companies, and public health agencies (Anderson and Agarwal 2011). As far as the Internet of Things is concerned, Sheng et al. (2008) were among the first to shape our understanding of privacy implications associated with the vision of ubiquitous computing. They predicted privacy concerns to emerge because enabling technologies like RFID and sensor networks collect data in a continuous fashion and often without individuals’ active consent (e.g., in the case of RFID-enabled carts in supermarkets that continuously track and report customers’ location). Interestingly, such tracking is commonplace for many of today’s IoT devices, which creates new opportunities and challenges for research on users’ information privacy (Lowry et al. 2017).

In retrospect, it becomes apparent that technological innovation is one of the main triggers for new privacy studies. Novel technologies have often stimulated privacy research and catalyzed theory development. The privacy implications of IoT devices and the vision of a fully connected, smart world they are associated with, have, however, not yet been adequately examined (see Yun et al. 2018). Given the rapid diffusion of IoT devices across many areas of life, ranging from fitness to housing to mobility, there is now a pressing need for research on privacy and data sharing in the context of the IoT (Lowry et al. 2017).

Privacy in the IoT Context

The increasingly ubiquitous and expanding class of consumer IoT possesses several distinct properties with important privacy implications. First, IoT devices have previously unseen abilities to sense their environment (e.g., the shape, position, and movement of nearby objects, as well as contextual factors such as air temperature and air quality) and their inner workings (e.g., mechanical wear and tear and usage patterns). Depending on the number and types of sensors they are equipped with, data collection by IoT devices is characterized not only by its volume but also by


its specificity and continuity. Indeed, most IoT devices, such as wearables, smart home appliances, and connected cars, collect a continuous stream of data when in use (Oberländer et al. 2018). The level of intrusion into individuals’ informational space is determined by the range and activity of sensors embedded in a particular IoT device, which clearly differ across the various fields of consumer IoT, as illustrated in Table 1. Equipped with probably the most comprehensive sensors in consumer IoT, a connected car captures more diverse and more revealing information about its driver than, for example, a smart running shoe does about its wearer. The most valuable insights about individuals often emerge not from single points of data but from continuous streams over time (Shim et al. 2020). Acceleration and braking data, for example, provide insights into individuals’ general driving style, their mood, and their attitude toward the environment. The frequency of refueling and of tire pressure adjustments, in turn, may serve as indicators of drivers’ general safety needs. Recurring patterns in the routes traveled reveal not only frequently visited places, but also how much time the driver spent there. It might be possible to infer from personal driving data where a person works, what restaurants are preferred, or what leisure activities the driver engages in, with the accuracy of such inferences increasing with the amount of available data (De Montjoye et al. 2015).

Second, users of IoT devices are often left in the dark as to how precisely their data is being collected, integrated, and analyzed. This characteristic is observable across all fields of consumer IoT and may be partly due to the fact that IoT devices—in contrast to personal computers—are equipped with tiny screens or else entirely lack components that would allow them to display rich information. Users instead must rely on a second device, such as a smartphone or a smart speaker, to interact with the IoT device. Adapted to these circumstances, IoT devices’ user interfaces tend to offer only limited information and control options regarding personal data flows (Bahirat et al. 2018). Moreover, associated software components, for example a smartphone app that enables the steering and monitoring of the IoT device, often cannot be replaced or supplemented with more privacy-friendly versions. While users have access to various alternatives for software running on their general-purpose computing machines and can select the software that best meets their privacy preferences (e.g., various messenger and navigation apps for smartphones), IoT devices are often one-purpose computing machines and are, in most cases, inherently linked to the firmware they are shipped with. IoT device users hence often have to live with whatever level of transparency and control over personal data is granted by the manufacturer.

Third, the value proposition of IoT devices is inherently connected to the unconstrained collection, integration, and analysis of data. A robot vacuum cleaner, for example, would simply be unable to operate without a camera and computer vision algorithms that allow it to find its way around furniture and other obstacles. Without creating real-time indoor maps, it would be unable to document the cleaning process and outcome. Similarly, the core functionalities and services of connected cars, such as remote car control, predictive car maintenance, driving style assistants, and pay-how-you-drive insurance schemes, are reliant on unrestricted access to car data (Seiberth and Gruendinger 2018). This inherent reliance on car data is likely to sharply increase over the next decade, during which time vast amounts of driving data will be needed to train and validate autonomous driving algorithms (Eady 2019). For consumer IoT in general, the functionality available to the user sharply degrades in the presence of data constraints. In other words, sharing personal data is a necessary precondition for customer value creation in consumer IoT.

Finally, IoT devices and their increasingly powerful actuators³ go beyond previous technologies in that they invade not only users’ virtual space, but also their physical space. This adds a new dimension to the consequences of privacy-related decisions that users of IoT devices need to include in their calculus (Dinev and Hart 2006). Outside the IoT context, privacy risks stem primarily from unsolicited outflows of user information in combination with the exploitation of that information in undesired ways. Examples include individuals’ online purchase histories, which are sold to third-party marketers, social media profiles that are considered during the selection of job applicants, and personal passwords that end up in the hands of cybercriminals. Within the IoT context, in contrast, privacy risks result not only from unsolicited outflows but also from unsolicited inflows of data. Consider how faulty signals from sensors or the insertion of malicious data from the outside may induce general malfunctioning or even threatening behaviors from an IoT device. The severity of such threats relates primarily to the IoT device’s physical impact (i.e., its ability to manipulate the environment and the amount of mass it can set in motion) and its physical proximity (i.e., the extent to which it can invade the physical space of an individual or open it up for intrusion by others).

³ An actuator is a device component (often a type of motor) that allows the device—or a part of it—to move or to control some other mechanism in response to a signal received. Actuators have been described as the “muscle” of IoT. In the case of a smart vacuum cleaner, for instance, infrared sensors detect obstacles and send a signal to the processing unit, which in turn triggers the electric motors on each wheel (the actuators) to respond in such a way that a collision with the obstacle is avoided.


Table 1. Selected Fields of Consumer IoT

| Field of application | Exemplary products | Primary data types of data stream | Continuity of data stream | Variety of data stream | Physical impact | Physical proximity |
| Sports & fitness | Wearables like running shoes and fitness trackers | Location/routes travelled, vital signs, training intensity and success | low | medium | low | high |
| Housekeeping and appliances | Intelligent fridge, robot vacuum cleaner | Operational data (when and how long in use, chosen temperatures), content (fridge), room data (vacuum robot) | low/medium | low | medium | low |
| Building automation | Smart meters (water and heating), blinds and lights | Water and electricity consumption, room temperature and heating system operations, times of darkening and lighting of specific rooms | high | medium | medium | medium |
| Home security | Door locks, fire & smoke detectors, motion detectors | Motion in and around house, heat, smoke and indoor air quality, door lock position | high | low | low | high |
| Personal mobility | Connected cars | Location/routes traveled, driving style and road conditions, technical conditions of vehicle | medium/high | high | high | high |

Note: The data type, continuity, and variety columns describe the ability to sense (sensors); the physical impact and physical proximity columns describe the ability to act (actuators).

As illustrated in Table 1, the connected car ranks high in both of these categories and is hence associated with the most severe data-induced threats to individuals’ physical safety of all consumer IoT technologies present today. Interestingly, such threats do not need to be induced by opportunistic, negligent, or malicious human action, but can result from the behavior of the autonomously acting IoT artifact itself, which becomes a critical data-handling actor.

In summary, the IoT radically changes the nature of the privacy threats that users are exposed to because of several factors: (1) the unprecedented volume, specificity, and continuity of data streams collected by IoT devices—with many IoT devices being “always on”; (2) the opacity of data collection, integration, and analysis, coupled with the fact that the user has little or no power to control data flows; (3) the degradation of functionality in the presence of data constraints; and (4) the ability of IoT devices to invade users’ virtual and physical spaces. Factors (2) and (3) are mostly manufacturer-specific, irrespective of the particular type of consumer IoT. For example, among electronics manufacturers, Apple incorporates more privacy-friendly defaults and more granular privacy controls than Amazon does regarding products of the same type, e.g., smart speakers (Kelly 2019). Factors (1) and (4), in contrast, are specific to the type of consumer IoT. This makes it possible to arrange IoT devices like smart shoes, smart fridges, and connected cars on a continuum from low to high according to their ability to sense and to (physically) act (see Table 1).

Once again, discontinuous technological developments are set to challenge and extend existing theory on privacy and data sharing decisions. To help privacy research engage with the emerging realities of the IoT, we explore the specific context of connected cars and the specific data types that these novel IT artifacts collect. This is an ideal context for our purposes, given the presence in connected cars of all four distinguishing properties of IoT devices, which will be compounded as we advance toward the vision of autonomous driving.

Mixed Methods Research Design

Mixed methods research generates rich insights by combining qualitative and quantitative methods in the same inquiry. Mixed methods allow for deeper and more robust analyses of a phenomenon by leveraging the complementary strengths of both approaches (Johnson and Onwuegbuzie 2004). While context-sensitive research is depicted as a promising endeavor to advance theory in general (Hong et al. 2014; Johns 2006), Venkatesh et al. (2013) suggest that mixed methods research is particularly well-suited to examine IS-related phenomena that closely interact with context, i.e., with the IT artifact. This is especially true when existing findings are fragmented, inconclusive, or equivocal (Venkatesh et al. 2016). Mixed methods research is not only frequently called for (Ågerfalk 2013; Venkatesh et al. 2013) but also increasingly adopted in research practice—with IS taking a leading role (e.g., Srivastava and Chandra 2018; Wunderlich et al. 2019).


[Figure 1 arranges the three studies along a timeline and pairs each with its guiding research question:]

Study 1, interviews (n = 120). Qualitative research question: “Why are individuals reluctant to share personal driving data with other parties and what determinants play a role in individuals’ data sharing decision in the context of connected cars?”

Studies 2 and 3, survey (n = 333) and field experiment (n = 324). Quantitative research question: “How do the identified determinants and introduced incentives interrelate in shaping individuals’ decisions to share personal driving data?”

Overall inquiry. Mixed methods research question: “How can individuals’ data sharing decisions in IoT-related contexts be conceptualized and what managerial opportunities exist that could enhance data sharing for service innovation?”

Note: Findings of Study 2 were supported by two replication studies (n1 = 131 / n2 = 730); the research design of Study 3 was informed by a survey-based pilot (n = 346).

Figure 1. Mixed Methods Approach and Research Questions

The execution and interplay of qualitative and quantitative research in mixed methods designs can be manifold and should be guided by the objectives of the particular research endeavor. The overall objective of our work was to derive a holistic understanding of individuals’ data sharing decisions in IoT contexts, as exemplified by the case of connected cars. Toward this goal, and following the methodological guidelines put forward by Venkatesh et al. (2013, 2016), we conducted three consecutive studies such that the findings of one study could inform the next. While an inductive qualitative interview study (Study 1) helped contextualize our theorizing, two subsequent deductive quantitative studies—one a survey (Study 2) and the other a field study (Study 3)—provided validation for our emerging contextualized model. This is in line with Venkatesh et al.’s (2013, p. 25) proposal that “interviews, a qualitative data collection approach, can provide depth in a research inquiry by allowing researchers to gain deep insights from rich narratives, and surveys, a quantitative data collection approach, can bring breadth to a study by helping researchers to gather data about different aspects of a phenomenon from many participants.” The overall purpose of our sequential mixed methods research design could be depicted as being primarily developmental with an element of confirmation and completeness according to Venkatesh et al.’s classification. To further enhance the richness and practical relevance of our work, we followed the guidelines for context-specific theorizing put forward by Hong et al. (2014) in executing our quantitative studies, which included, among other things, formulating context-sensitive versions of our measurement instruments. Figure 1 illustrates the three studies and their interplay.

Study 1: Data Sharing in the Context of Connected Cars

Methods

Design: In Study 1, we explored the determinants that shape individuals’ sharing decisions in the context of connected cars. In particular, our objective was to discover the origins, manifestations, and consequences of privacy concerns in this specific IoT context and to identify additional factors that influence individuals in their decision to share driving data with third parties. Given the novelty of connected cars and the unique privacy challenges associated with them, we employed an inductive qualitative approach using interviews (Charmaz 2006).

Procedure: We recruited a sample of 120 car drivers in Germany for our interview study (age: M = 29.2 years, SD = 13.8; gender: 47% female; higher education = 34%; active use of online social networks = 68%). We ensured that participants represented a broad range of personal mobility habits and driving experiences. On average, recruited participants covered a distance of 10,438 km/year (SD = 9,430; Min = 800; Max = 55,000) and 28% of them had done so with a car that they said they did not share with any other person. Following Flick’s (2014) guidelines, we used short briefings as introductions to the interview. These were meant to provide participants with information on connected cars, including indications of their potential to collect, process, and transmit driving data to enable a broad array of novel services without framing this in either a positive or a negative way. The


introduction ensured that participants had a basic understanding of the technologies and applications of the connected car. Brief, semistructured interviews (length: M = 5.4 minutes, SD = 1.9) allowed participants to elaborate on their beliefs about and associations with the connected car, as well as their reasons for or against sharing driving data. At the end of the interview, study participants completed a short questionnaire on their personal characteristics and their mobility habits. All interviews were conducted in person, digitally recorded, fully transcribed, and analyzed for content. The transcribed interview material amounted to a total of 31,946 words or 114 typewritten pages.

Analysis: Given our goal to arrive at a better understanding of privacy concerns and data sharing decisions in the emerging IoT context of connected cars, we employed an inductive approach in interpreting our interview material, similar to extant qualitative studies in privacy research (e.g., Karwatzki et al. 2017a; Krasnova et al. 2009; Parks et al. 2017). In the first round, we used open coding to label the emotional reactions, associations, and opinions that the participants expressed. The first author and an experienced research assistant did so independently, using both in vivo codes and provisional category names (Charmaz 2006). In an iterative process, we then refined, merged, and extended initially assigned codes into conceptual categories. This step was characterized by moving back and forth between data, codes, and relevant literature, and was accompanied by ongoing discussions among the three authors (Charmaz 2006). The coding procedure was supported by ATLAS.ti software. As an additional validation step, and to derive reliable quantitative statements regarding the perception of negative consequences of sharing driving data, a previously uninvolved research assistant reanalyzed this particular category of codes, using a coding scheme and having no knowledge of the previously assigned labels (Miles and Huberman 1994). Results were compared and differences in the assigned codes were resolved by discussion among the authors.

Findings and Discussion

Privacy concerns and negative consequences associated with individuals sharing driving data: We observed that interviewees viewed the adoption of connected cars and the subsequent sharing of driving data with other parties as a potential threat to their privacy. 41% of them articulated concerns over the loss of their privacy, with reactions ranging from a slight sense of discomfort to serious fears. Answers indicated that sharing of personal driving data with the car manufacturer and other service providers creates a sense of losing the ability to control such data. One participant said: “There is no way for me to verify what my car manufacturer really does with this data. I don’t know if they keep their promises.” Another participant commented: “Let’s say I transmit the data to my car manufacturer, for example. In this case, I actually do not really know what will happen to it. I can’t control it anymore. This somehow scares me.” Participants came up with various negative consequences originating from a loss of their ability to control personal driving data. Table 2 lists these negative consequences in the order of their prevalence in the interview material.

All negative consequences revealed during our interviews can be assigned to one or more of the privacy concern dimensions proposed by Smith et al. (1996). While the basic dimensionality of individuals’ privacy concerns appears to hold across technological contexts, our results underline the need for contextualizing the specific content of these concerns. First, similar to previous findings on individuals’ fears when providing access to personal information (Karwatzki et al. 2017a), data sharing in connected cars is associated with the risk of financial harm (e.g., Items 2, 5, 10, and 14 in Table 2), social harm (e.g., Items 6, 12, and 15 in Table 2), and psychological harm (e.g., Items 6, 11, and 13 in Table 2) to the sharing individual (Moon 2000). However, the manifestation of data-induced threats to users’ physical safety (e.g., Items 3 and 7 in Table 2) distinguishes the IoT context of connected cars from other areas in which privacy is a concern.

Personal driving data might reveal not only where individuals live and work, but also their daily schedules and routines. One participant stated, “The problem with the collection of driving data is, really, that they see that from Monday until Friday I leave home at 7 a.m. and return at 6 p.m. And there might be someone who takes advantage of the fact that nobody is at home … and robs your house.” He went on, “They can exactly see where I am and when I am coming back … [and] they know if I’m in the car with buddies from football or if my child is with me.” The presence of a child can indeed be detected by the car through sensors of the seat belt reminder system—a standard feature in modern cars. As noted above, in the context of connected cars and IoT devices in general, it is usually not the individual data point that reveals potentially sensitive information about users and their habits, but the inferences made based on recurring patterns in data from a variety of sources and from multiple points in time.

The individual user typically has little choice in regulating the content or the quantity of data streaming from the connected car, unless the user chooses to discontinue connectivity services altogether, accepting a drastic degradation in functionality. For instance, only through sharing wear-and-tear data with the workshop can drivers benefit from predictive maintenance services.
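The kind of routine inference described above, in which a driver's weekly schedule emerges from recurring trip timestamps rather than from any single data point, is straightforward to reproduce. The following is a minimal sketch in Python using a hypothetical trip log; the timestamps, field layout, and function name are illustrative only, not data or code from our studies:

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

def departure_profile(trip_starts):
    """Group ignition-on timestamps by weekday and average the
    departure time (minutes after midnight) per weekday."""
    by_day = defaultdict(list)
    for stamp in trip_starts:
        t = datetime.strptime(stamp, "%Y-%m-%d %H:%M")
        by_day[t.strftime("%A")].append(t.hour * 60 + t.minute)
    return {day: round(mean(minutes)) for day, minutes in by_day.items()}

# Two weeks of hypothetical commutes: the recurring ~07:00 weekday
# departure is invisible in any single record but obvious in the
# aggregate, which is precisely the inference interviewees feared.
trips = ["2021-03-01 07:02", "2021-03-08 06:58",   # Mondays
         "2021-03-02 07:04", "2021-03-09 07:00",   # Tuesdays
         "2021-03-06 10:40", "2021-03-13 11:00"]   # Saturdays
profile = departure_profile(trips)
```

With more weeks of data, the spread around each weekday's mean would further distinguish a fixed routine (low variance on workdays) from incidental trips, illustrating why aggregation across time, rather than any individual record, is what makes such data revealing.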


Table 2. Commonly Stated Negative Consequences Associated with Data Sharing in the Context of Connected Cars

No. | Description | Prevalence | Privacy concern dimension
1 | (Mis)use of shared driving data for unexpected purposes by the requesting stakeholder or resale to other companies | High | Secondary use
2 | Threat of prosecution, fines or loss of driving license in case of recorded misbehavior (data transmission to the police and to regulatory authorities) | High | Secondary use
3 | Vulnerabilities that emerge from being monitored in daily routines (e.g., burglars might find out when no one is at home or know that car is usually not locked when parked in one’s private garage) | High | Collection
4 | Theft and manipulation of driving data (data leaks, hacker attacks) | Medium | Unauthorized access
5 | Denial of service or increased costs of car insurance or car rental (e.g., for risky driving style) and liability issues in case of self-inflicted car accidents | Medium | Collection; secondary use
6 | Feelings of surveillance and the move toward a fully transparent society | Medium | Collection
7 | Manipulation of vehicle functions and car hijacking through remote access | Medium | Unauthorized access; other
8 | Increase of unsolicited advertisement and reward program offerings (especially by car dealers and service garages) | Low | Secondary use
9 | Incorrect assignment of data to a particular driver and incorrect inferences from driving data as a result (e.g., in the case of a company car or when a car is shared among family members) | Low | Errors
10 | Loss of warranty services for the vehicle (e.g., when improper handling is recorded) | Low | Secondary use
11 | Feeling of being overwhelmed by the complexity of data flows of connected cars | Low | Collection
12 | Disadvantages when applying for or performing driving jobs | Low | Secondary use
13 | Identification as a potentially poor driver (feelings of embarrassment, social stigma) | Low | Collection
14 | Exposure to unfair practices to stimulate spending on car maintenance and repair (digital manipulations by car manufacturer and workshops) | Low | Unauthorized access; errors; other
15 | Unintended support of optimized positioning of radar traps (based on information of how fast cars drive on certain streets) | Low | Secondary use

Note: Low = 10-19 responses, Medium = 20-44 responses, High = more than 45 responses

Similarly, only through sharing data on driving style with a car insurance company can drivers obtain discounts for safe driving. Because of this fact, some participants seem to relate the adoption of connected cars to the inevitable sharing of driving data with other parties, for example, “I would never get into one of those [connected] cars…. My data belongs to me.” This seems understandable, as privacy-enhancing strategies, like those observed in individuals’ responses to data requests on the web (e.g., intentionally entering false information; see Son and Kim 2008), are not transferable to the context of connected cars and other IoT devices. The inevitability of data collection and the lack of control over data flows make drivers of connected cars feel they are being observed and exposed (see Item 6 in Table 2).

Study participants also imagined direct threats to their vehicles and to their physical safety when driving a connected car (Item 7 in Table 2). One participant mentioned that he was concerned that the manipulation of driving data might cause the engine to overheat. Another stated that she was afraid that someone could hack into the system and remotely apply or lock the brakes. Presumably, those fears stem from media reports describing how attackers managed to remotely manipulate connected cars in multiple ways, such as the case of a Jeep Cherokee, which was intercepted and driven off the road (Wired.com 2015). Despite manufacturers’ efforts to prevent such incidents through security systems, the fear of losing physical control over the vehicle will continue to influence individuals’ perceptions and the adoption of connected cars.
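A common building block for preventing the spoofed remote commands described above is to authenticate every message before the vehicle acts on it. The sketch below is a simplified illustration rather than any manufacturer's actual protocol; the key, message format, and function names are hypothetical. It uses an HMAC so that altered or forged commands are rejected:

```python
import hashlib
import hmac

# Hypothetical symmetric key shared between vehicle and backend.
SECRET_KEY = b"per-vehicle-key-provisioned-at-manufacture"

def sign(command: bytes) -> bytes:
    """Compute an authentication tag over a command message."""
    return hmac.new(SECRET_KEY, command, hashlib.sha256).digest()

def accept(command: bytes, tag: bytes) -> bool:
    """Verify the tag in constant time before acting on the command."""
    return hmac.compare_digest(sign(command), tag)

genuine = b"climate:precondition:on"
tag = sign(genuine)

ok = accept(genuine, tag)             # authentic command is accepted
forged = accept(b"brakes:lock", tag)  # altered command is rejected
```

In practice, message authentication is only one layer: real deployments also need replay protection (counters or nonces), secure key storage, and encryption if the command contents are themselves sensitive.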


Looking at the statements referring to negative consequences associated with connected cars, we made two further meaningful observations. First, notable risk spillovers may occur, such that negative consequences are not limited to the drivers, but instead also affect individuals in close physical proximity (e.g., family members, co-workers, car passengers or other drivers as well as pedestrians) and even society at large (e.g., collective surveillance) (see Items 6 and 15 in Table 2). Related to this argument, some negative consequences emerge when IoT devices, especially those with high acquisition costs, are shared with others, and the data become incorrectly associated with another individual (see Item 9 in Table 2). Second, particularly revealing or consequential data streams need not necessarily be fueled by the output of embedded sensors of connected cars only. Instead, data streams combined from a multitude of IoT devices can make individuals particularly vulnerable to unsolicited contacts by commercial entities and attacks by hackers. Privacy-related risks might hence accumulate and potentiate within IoT ecosystems (e.g., see Item 3 in Table 2 for home security technologies).

However, it was also insightful to learn that—consistent with the notion of privacy calculus—participants weighed the consequential loss of control of personal driving data against the benefits they could receive in return. In regard to sharing data with the car manufacturer, one participant stated, “I must admit, I would be somewhat concerned. But if it would pay off, I would think about it.” Another said “I don’t want to become fully transparent, to be honest. At least not if it doesn't come with any benefits for me.” Study participants mentioned a broad set of potential benefits that could result from sharing personal driving data. Among the most cited were monetary payments, supplementary personalized services, technical improvements of vehicles, emotional benefits resulting from supporting research and development projects, and direct contributions to road safety or improved traffic flows.

Determinants of Negative Consequences

Whether or not connected car drivers will actually face one or more of the negative consequences identified above will ultimately be determined by (1) the intentional and unintentional actions of the party with whom driving data are shared, (2) the level of data security embedded in connected car technology, and (3) the types of driving data that are captured and shared. Drivers seem to make judgments concerning these three factors through assessing the trustworthiness of the party receiving the data, the expected level of data security, and the sensitivity of shared data. For the participants, trust in the other party meant believing that any shared personal driving data would be handled appropriately and not used for purposes other than those for which they were granted permission. Interestingly, many participants—even though they were consciously aware of the potential negative consequences of data sharing—were convinced that they, themselves, would remain unaffected. The main reason for attributing a low likelihood to negative scenarios was, presumably, their trust in the data-requesting party who they believed “will do everything right.” We found particularly high levels of trust in cases where participants imagined that the party receiving data was the manufacturer of their car: “With the car manufacturer? Of course I’d share my data, because, after all, I trust them.” In contrast, less trusting participants stated that other parties “don’t keep their promises” and generally questioned the integrity of data requesting parties. Some of these participants were especially suspicious of internet companies that already engage in intensive data collection in other contexts. One participant stated, “Google is already observing everything I do online. I really would not like them to monitor my driving as well.”

Participants also pointed to the technical issues related to the protection of driving data. Statements referring to information security of connected cars were usually negative: participants stated that data cannot be sufficiently protected. One participant said: “It’s like the computer. No system is 100 percent secure. Nobody can guarantee you full security.” Regarding their overall privacy concerns, participants reflected on the sensitivity of their driving data. Interestingly, they came up with very different answers. One participant said, “I don't care about my driving style. I don’t view this as sensitive data that should be protected.” Another participant stated the opposite: “I think information on my driving style is as sensitive as data can get. I would be extremely concerned if somebody got access to this data.”

Feelings of Ownership toward Driving Data

Participants’ responses indicated that the feelings of data ownership may play an influential role in their sharing decisions. Participants often emphasized words such as “my,” “mine,” and “not theirs” in their answers, and, directly or indirectly, claimed ownership of the types of data that connected cars can capture and transmit. Such claims came with a dismissive attitude toward the sharing of personal driving data in general and a strong desire to exclusively control such data in particular: “No one gets my data, simply as a matter of principle”; “In the end, this is my data, right? Companies should keep their hands off it.” In contrast to their more detailed elaborations on their privacy concerns, individuals rarely differentiated between the types of driving data they claimed possession of. In their statements, participants instead referred to personal driving data as a whole. The interviews were also helpful in shedding light on why individuals wanted to preserve their feelings of possession toward their driving data.


Interviewees indicated that preferred car settings, radio channels, routes traveled, and driving style are indicative of who they are as individuals. One respondent said, “I think I have some kind of attachment to my driving data, just as I have to any other kind of personal information. [Having control over driving data] is important to me, because it reflects what kind of driver I am.” Moreover, driving data seem to help people to take care of their vehicle and operate it safely and skillfully. One respondent expressed this in the following way: “The data is important because it can help me to drive better … For example, if I do not shift gears correctly, I might damage the engine. So, the driving data can help me to protect my engine.” Another participant said, “[Driving data] helps me to better understand how my car behaves.” Emphasizing a different aspect, another participant stated, “The way how people drive can be improved by looking at the data. For example, it can become more environmentally friendly.” These statements express feelings of self-efficacy and self-identity, motives that are said to cause individuals’ desire to experience possession (Pierce et al. 2003). While the state of psychological ownership relates to a cognitive perception (e.g., “It’s my private car and my private data is in it”), individuals seem to experience it as an emotional and affective sensation. Such feelings of ownership appear to determine participants’ desire to control access to their driving data.

Consistent with Venkatesh et al.’s (2013) statement that interviews “can provide depth in a research inquiry by allowing researchers to gain deep insights from rich narratives” (p. 25), Study 1 complemented extant privacy research and informed our exploration of data sharing decisions in IoT contexts. First, we were able to confirm that privacy concerns play a significant role in data sharing decisions in the context of connected cars and we shed light on the nature of those concerns. Most importantly, our findings highlight the fact that IoT devices introduce an additional category of the negative consequences of data sharing—namely, direct threats to users’ physical safety. These fears originate from the fact that IoT devices are equipped with actuators that manipulate their immediate environment based on data signals. It is this intrusion into users’ informational and physical space that renders the privacy implications of IoT devices different from those related to online social networks (Krasnova et al. 2009) or online services in general (Karwatzki et al. 2017a). Second, our qualitative study points to the so-called psychology of possession and property (Peck and Shu 2009) as a fruitful avenue to arrive at a more holistic understanding of data sharing decisions in IoT contexts. This is particularly promising, as psychological ownership might even be a factor in the sharing of low-sensitivity data, such as information on cars’ engine oil levels and engine temperatures. Lastly, our qualitative study highlighted the continued relevance of the privacy calculus perspective as a basic theoretical framework—which, however, needs to be expanded into a contextualized model of sharing decisions that incorporates psychological ownership effects and contextual risk factors such as trust, data sensitivity, and data security.

Emergent Conceptual Model

Psychological Ownership

Individuals are known to feel a connection with various objects of possession—including, and especially, homes and cars (Dittmar 1992). Feelings of ownership can also emerge regarding nonphysical objects such as ideas, creative endeavors, and personal data (Anderson and Agarwal 2010; Pierce et al. 2003). This is also the case for driving data, as observed in our exploratory interview study (Study 1). The concept of psychological ownership generally refers to a state in which individuals feel as though the object is “theirs” (Pierce et al. 2003) without presupposing legal ownership (Furby 1980). Ownership feelings develop through individual experiences of controlling the object, getting to know the object intimately, and investing the self into the object (Pierce et al. 2001). Such feelings are meaningful in practice as they are shaped by—and, in turn, shape—individuals’ self-concept and perceived self-efficacy (Belk 1988). Consistent with this notion, Pierce and colleagues (2001) proposed that psychological ownership is rooted in a set of various motives, namely efficacy, effectance,4 self-identity, and home (i.e., the state of having a physical space to inhabit). Considering the fundamental role of perceived possessions in human lives (Belk 1988), it is not surprising that psychological ownership has been found to affect individual behavior in significant ways. Such behavioral consequences include protective behaviors vis-à-vis the object (Brown et al. 2005) and greater stewardship (Korman 1970). Anderson and Agarwal (2010) demonstrated that psychological ownership toward computers increases users’ intention to protect them by means of antivirus software or firewalls. Other scholars observed that individuals with feelings of ownership over their ideas resist sharing them with colleagues (Webster et al. 2008) and seek to retain exclusive control over them (Choi and Levine 2004; Pierce et al. 2009). These behaviors appear because of emotional distress from losing control over objects that individuals perceive to own (Bartunek 1993) and the feeling of loss of pleasure (Heidegger 1967).

4 Effectance is defined as the state of having a causal effect on an object in the environment and constitutes the motivational aspect of competence (White 1959).


Feelings of ownership toward personal driving data vary between drivers, not least as a function of whether they own, rent, or share their car with others (one respondent commented: “It’s a company car anyway, I don’t care about the data”). However, consistent with the literature on psychological ownership and our insights from Study 1, we expect car drivers’ feelings of ownership toward data collected by the various sensors of their connected cars to shape their data sharing decisions. Drivers with strong feelings of ownership regarding their driving data are likely to resist any request for data sharing, irrespective of privacy concerns. Given the theoretical arguments and empirical evidence as well as our inductive insights from Study 1, we expect psychological ownership of driving data to trigger protective behaviors, thereby reducing data sharing. Thus,

Hypothesis 1: The stronger the drivers’ psychological ownership of driving data, the less likely they will be willing to share that data.

The Mediating Role of Privacy Concerns

We expect privacy concerns to mediate the negative relationship between feelings of ownership of driving data and the likelihood of sharing it with other parties. Exhibiting control over an object was identified as one of the most important paths to the development of psychological ownership toward the object (Pierce et al. 2003). In general, the greater the control over an object, the greater the perceived loss when control is diminished (Pierce et al. 2003). Applied to our context, losing the ability to control personal data would prevent drivers from maintaining and affirming their emotional attachment to their data. Losing control over personal data, as we have seen, might also entail more direct negative consequences, the likelihood and severity of which individuals anticipate in their overall level of privacy concerns (Smith et al. 1996). Individuals hence seem to benefit from not disclosing personal data in two ways: it allows them to preserve their emotional attachment toward their data and it reduces their privacy concerns. This is why we expect that individuals with stronger feelings of data ownership will also express more concerns about privacy when faced with a request to share their driving data.

Conceptually, psychological ownership describes a state of emotional attachment to an object (e.g., driving data) associated with the desire to maintain control over it and can exist regardless of legal ownership (Pierce et al. 2003). Privacy concerns, in contrast, pertain to the negative consequences an individual associates with a loss of the ability to control personal information as a result of sharing it with another party (Xu et al. 2012). In line with their role as the predominant cost dimension in the privacy calculus, privacy concerns have been found to negatively affect individuals’ willingness to share personal data (Culnan and Armstrong 1999; Dinev and Hart 2006). Studied extensively in privacy research, this relationship has been successfully reproduced across various contexts (see synthesis in Li 2011). Indeed, some scholars claim privacy concerns to be the single most important factor in individuals’ disclosure decisions (Malhotra et al. 2004; Stewart and Segars 2002). Linking the concepts of psychological ownership and privacy concerns in shaping individuals’ sharing decisions, we hypothesize:

Hypothesis 2: Privacy concerns will mediate the negative relationship between drivers’ psychological ownership of driving data and their likelihood of sharing that data.

The Role of Contextual Privacy Risk Factors in Shaping Privacy Concerns

We conceptualize contextual privacy risk factors as important, context-informed antecedents of individuals’ privacy concerns. Behavioral or technological in nature, these risk factors jointly determine how likely and severe individuals expect the negative consequences of personal data sharing to be. The three privacy risk factors that emerged as particularly relevant in our interviews were relational trust, data sensitivity, and data security.

Consistent with our exploratory insights from Study 1, privacy literature suggests that relational trust plays an important role in diminishing privacy concerns (e.g., Gefen et al. 2003; Krasnova et al. 2010; Pavlou et al. 2007). Relational trust is the belief that the other party in a relationship possesses characteristics that inhibit it from engaging in opportunistic behavior (Dinev and Hart 2006; McKnight et al. 2002). In line with this conceptualization, individuals depict data sharing with a trusted stakeholder as less risky because intentional data misuse is perceived to be less likely (Gefen et al. 2003). Because relying on trust is an effective human strategy for dealing with uncertainty in general (Luhmann 1979), it is likely to be a part of the decision heuristics that are relevant to data sharing decisions in IoT settings as well. In the case of the connected car, as with many other IoT devices, sharing data with another party is usually not a single, one-time transmission. Instead, individuals typically agree to continuous or repeated transmissions of data. In the case of telematic insurance, for example, drivers let insurers monitor their driving behavior to receive discounts for careful driving (Vaia et al. 2012). In this case, it is difficult for the individual driver to assess ex ante the extent to which the ongoing transmission of sensor-generated data could become a point of concern in the future. For example, a car might record and transmit speeding data, which, when combined with other information, will reveal sensitive insights through the use of


advanced analytics. Drivers’ privacy concerns are hence likely to be shaped not only by the type of data to be shared with another party but also by their expectations about how this party will handle their data once shared.

Though studies conducted in other settings also point to the role of trust, they do not provide consistent suggestions on the positioning of trust in empirical models to explain individuals’ data sharing decisions. While some scholars have suggested that trust enhances sharing intentions directly (e.g., Dinev et al. 2006), others view trust as an antecedent (e.g., Krasnova et al. 2010) or outcome of privacy concerns (e.g., Bansal et al. 2010). These differences, however, seem to be based on how the risk-mitigating factor was conceptualized, namely as a general (general institutional trust, as in Dinev et al. 2006) or context-specific belief (relational trust, as in Krasnova et al. 2010). Adopting the concept of relational trust, we argue that trust in the requesting stakeholder will directly impact individuals’ privacy concerns in a given disclosure situation. Thus, we hypothesize:

Hypothesis 3: The higher the level of trust in the data-requesting party, the lower drivers’ privacy concerns will be.

A second contextual privacy risk factor that our interviews revealed is the perceived sensitivity of personal driving data, which might be low for some types of data (e.g., outside temperature) and high for other types of data (e.g., acceleration patterns). Data sensitivity can be defined as the extent to which information, if released or shared, could cause harm to the subject of the information (Gandy 1993): for example, in the form of social stigma (being viewed as a poor driver; see Item 13 in Table 2), discrimination (being charged higher rates for car rental because one’s driving style was assessed as risky; see Item 5 in Table 2), criminal prosecution (in case of serious speeding; see Item 2 in Table 2), or loss of property (being robbed in times when the car’s location indicates that nobody is home; see Item 3 in Table 2). Studies have repeatedly shown that individuals’ willingness to share decreases as data sensitivity increases (e.g., Nowak and Phelps 1992; Phelps et al. 2000). Smith et al. (2011) conclude that the level of data sensitivity might be the most important contextual factor in individuals’ sharing decisions. As the sensitivity of data directly relates to the severity of negative consequences associated with disclosure (Mothersbaugh et al. 2012), it can be conceptualized as a further antecedent of context-specific privacy concerns (Kehr et al. 2010).

Especially in the IoT context, there is substantial variance in the sensitivity that individuals assign to different data types being collected. This simply reflects the broad range of data types captured by the multitude of sensors that IoT devices such as connected cars are equipped with. Most types of data collected are likely to be of low perceived sensitivity, especially those that pertain to the device’s inner workings (e.g., engine temperature, mechanical wear and tear) and do not allow inferences to be drawn about individual users’ preferences and behaviors. Findings from basic research on human cognition suggest that the observed variance in data sensitivity ratings might also stem from systematic over- and underestimations. The cognitive biases that lead to such a phenomenon have been repeatedly observed in situations in which individuals are exposed to high levels of uncertainty (Reyna 2004). Indeed, most drivers might struggle to recognize what different types of sensor-generated data can reveal about them, especially when combined from multiple sources and analyzed with advanced machine learning techniques. The complexity of such data sensitivity assessments is arguably much higher in the IoT domain than in contexts that users are familiar with, such as data on health status as documented in medical records or profile data from social network sites.

Overall, we expect users’ privacy concerns to increase with the perceived sensitivity of the data to be collected and shared. We thus propose:

Hypothesis 4: The higher the level of perceived data sensitivity, the higher drivers’ privacy concerns will be.

A final relevant observation from Study 1 regarding contextual risk factors pertains to individuals’ perception of data security. Overall, 24 of our interviewees (20% of the total sample) mentioned concerns about nefarious actors like hackers and cybercriminals, evoking the substantial security challenge that IoT technologies present (Poudel 2016). Indeed, interconnected systems that involve various devices, computers, and transmission technologies (e.g., Bluetooth, Wi-Fi, mobile broadband) offer multiple entry points for attacks. Given that the vulnerability of a system is determined by its weakest link, ensuring security in the IoT is particularly challenging (Lowry et al. 2017). Cybersecurity challenges are a particular concern in the domain of connected cars, not least because of their potential physical impact (NHTSA 2018). Governments around the globe are increasingly aware of such security threats and are enacting legislation and voluntary measures to tackle security issues (Lim and Taeihagh 2018).

As scholars have frequently noted (e.g., Belanger et al. 2002; Landwehr et al. 2012), while information privacy and security are conceptually related, they are two distinct concepts. However, this distinction is inadequately reflected in models of disclosure decisions (Smith et al. 2011). While privacy refers to the level of control an individual can exert over the collection and use of information tied to his or her identity, security refers to the protection of data from unauthorized access as well as damage (Belanger et al. 2002) and is associated with three goals (Von Solms and Van Niekerk


2013): assurance of integrity (information should not be altered during transmission and storage), authentication (the identity of the parties involved and their eligibility to access data should be verified), and confidentiality (data access and use should be confined to authorized parties and purposes only). Meeting these three security goals is a necessary precondition for privacy (Acquisti et al. 2016; Smith et al. 2011). Indeed, when personal data are protected from, for example, third-party access, meaningful control over their usage becomes possible. Since the findings of Study 1 revealed that drivers are aware of the link between security and privacy, we expect perceived data security to act as an important contextual risk factor, along with trust and data sensitivity, for shaping privacy concerns in the IoT. Thus, we predict:

Hypothesis 5: The lower the level of perceived data security, the higher drivers’ privacy concerns will be.

The Moderating Role of Incentive Design for Data Sharing

Notwithstanding the protective tendencies associated with psychological ownership, individuals are typically willing to give up (perceived) ownership of an object in return for some form of compensation (Belk 2010). Virtually all objects of ownership, whether material or nonmaterial, are exchangeable for a corresponding countervalue. Offering a countervalue (e.g., a monetary payment or a service) in this regard typically serves as compensation for reducing feelings of ownership. In our final hypotheses, we propose an alternative managerial strategy to mitigate drivers’ protective tendencies toward their driving data—namely by understanding and appealing to those motives that explain why individuals develop feelings of ownership in the first place (Pierce et al. 2003).

Possessions help individuals define themselves, express their self-identity to others, and maintain continuity of the self (Price et al. 2000). This is the case even when the possession is personal data. As cited earlier, drivers we interviewed indicated attachment to their driving data. Further, possessions help their owners achieve desired outcomes, both socially and physically (Dittmar 1992). As a case in point, our interviewees viewed their driving data as a means to better understand their car, avoid damaging it, and, ultimately, become a better driver.

Pierce et al. (2003) concluded that psychological ownership must serve the motives of self-identity as well as efficacy and effectance. It is thus not surprising that some suggest that a loss of perceived ownership can bring “a sense of shrinkage of our personality, a partial conversion of ourselves to nothingness” (James 1890, p. 293). Loss of possessions can trigger feelings of depression (Formanek 1991). This raises the question of whether personal data could be used in such a way that the motives of self-identity and effectance are enhanced rather than threatened. In that case, the effect of psychological ownership on individuals’ data-sharing decisions might weaken or even change direction, thus increasing data sharing rather than decreasing it. Such a change could occur as a result of introducing carefully designed incentives that enable individuals to enhance their self-efficacy and contribute to their self-identity as a direct consequence of disclosing their personal data. Consider the case of sharing your driving data in exchange for obtaining access to real-time feedback on your driving style that helps you improve your driving and confirms your self-image of being a skillful driver. While the idea of compensation rests on inhibiting feelings of ownership, such subtler incentives help to foster psychological ownership as an emotional state through sharing, thereby redirecting the influence of psychological ownership on individuals’ behavior. In this case, both the negative effect of psychological ownership and the need for compensation might dissipate.

Hui et al. (2006) identified self-enhancement, defined as the improvement of the self-concept in relation to others, as a major motivation to share personal data. They found that incentives that build up one’s self-concept—such as a premier membership that reflects a certain status in an online community—increased individuals’ intention to share personal information. Perhaps the most compelling evidence for the validity of our arguments, however, is the massive self-disclosure observable on online social networks such as Facebook (Chen 2013). Despite expressed privacy concerns (Krasnova et al. 2010; Xu et al. 2008), large-scale data leaks (e.g., the Facebook–Cambridge Analytica controversy in 2018), and perceived psychological ownership (Spiekermann et al. 2012), individuals continue to share highly personal data on online platforms, with the expression and enhancement of their identity being the key motivators (Schau and Gilly 2003).

Early privacy research found that individuals do not usually receive direct benefits from sharing personal data on the internet but receive indirect benefits when the sharing enables them to conduct transactions online (Dinev and Hart 2006). Revealing one’s shipping address and credit card information to e-merchants, for example, makes it possible to order products offered in online stores. Similarly, sharing personal financial information with a banking institution makes it possible to use e-banking services. Beyond being a prerequisite for the use of various digital services offered today, disclosing personal information allows individuals to increase their ability to derive value from digital services while decreasing their effort. For example, when registering for Pinterest, users are asked to specify topics they are most interested in. This ensures that only relevant content will be


shown to them during their future visits to the image-based social network. Other companies infer users’ interests directly from past interactions with their service: for example, the viewing histories of YouTube users determine future video recommendations. Whether the process is enabled by deliberate data sharing or not, the exploitation of personal data to make a digital service more relevant allows individuals to reach their goals more efficiently—e.g., quickly finding videos that fit their preferences. Some privacy scholars have tried to capture individuals’ perception of this form of self-efficacy enhancement via gains in convenience while performing certain online tasks. Hui et al. (2006), for example, found that expected time savings can motivate consumers to share personal information on the web. Similarly, Hann et al. (2007) showed that users are willing to give up some of their privacy to gain more convenience when using websites that are personalized to their preferences. These self-efficacy enhancing effects are likely to be even stronger for IoT devices and services that offer data-driven personalization advantages—for example, smart shoes that suggest the most effective training plan based on their wearer’s past running history and vital signs.

Synthesizing our theoretical arguments, we expect the design of incentives that appeal to motives of psychological ownership to be a particularly effective and legitimate managerial strategy for encouraging data sharing in the face of feelings of ownership. Thus, we posit:

Hypothesis 6a: Incentive design that enhances drivers’ self-efficacy will moderate the relationship between psychological ownership of driving data and the likelihood of data sharing, such that the higher the self-efficacy enhancement associated with the incentive employed, the weaker the negative effect will be.

Hypothesis 6b: Incentive design that is congruent with drivers’ self-image will moderate the relationship between psychological ownership of driving data and the likelihood of data sharing, such that the higher the congruence of the incentive employed with drivers’ self-image, the weaker the negative effect will be.

We show our full conceptual model in Figure 2. While the focus of Study 2 is on testing for mediation (Hypotheses 1 and 2), Study 3 tests the effects of contextual risk factors (Hypotheses 3 and 4) as well as the moderating role of incentive design (Hypotheses 6a and 6b). In sum, our conceptual model provides a sense of continuity with extant privacy research but also introduces extensions to existing models of disclosure decision-making that emerged from our deep engagement with the context of IoT in general and connected cars in particular. Regarding the determinants of data sharing that have been detected by studies in other settings (i.e., relational trust, data sensitivity), we advance current knowledge by explicating their critical role in IoT contexts. Regarding factors that have been underexplored or not yet considered by privacy scholars (i.e., data security and psychological ownership), we provide new insights. These highlight the important role of individuals’ perceptions of data security and psychological ownership in shaping their data sharing behavior. As such, they also illustrate the potential of a more nuanced view of incentives for data sharing that connect to more subtle human needs.

Study 2: Psychological Ownership and Privacy Concerns

Methods

Design: In Study 2, we collected survey data and conducted structural equation modeling (SEM) to examine the extent to which psychological ownership negatively affects drivers’ willingness to share driving data (Hypothesis 1) and whether this effect is mediated by privacy concerns (Hypothesis 2). To test our hypotheses, we measured our core variables after exposing subjects to a hypothetical data sharing scenario (Rosenthal and Rosnow 2006). Scenario-based approaches are appropriate for studying novel technology contexts (Sheng et al. 2008) and are commonly used in the privacy literature (e.g., Anderson and Agarwal 2011; Malhotra et al. 2004; Xu et al. 2009). Accounting for the situational sensitivity of privacy decision-making (e.g., Anderson and Agarwal 2011; Acquisti et al. 2012), we provided all subjects with specific information on contextual factors that have been shown in previous literature to influence individuals’ sharing intentions. These comprise the type of data (Phelps et al. 2000), the requesting stakeholder, and the declared purpose of data usage (Anderson and Agarwal 2011). We carefully selected contextual factors to create a realistic scenario related to the use of connected cars, while minimizing trigger effects on both the positive (e.g., arguments for sharing) and the negative (e.g., concerns) side, which could otherwise bias our findings (see Figure A1 in the Appendix). To further reduce ambiguity, the scenario description also contained information on connected car technology.

Procedure: We administered a paper-based questionnaire, in person, to 333 car drivers in Germany (age: M = 32.3 years, SD = 15.6; gender: 45% female; car usage: M = 12,765 km/year, SD = 11,448; legal ownership of car: 46%; Facebook users: 70%). This data collection approach allowed us to include in our sample individuals who would normally not participate in standard mail or online surveys (see Basi 1999), increasing the variety of views in our study.


[Research model diagram: the contextual privacy risk factors relational trust (H3, −), data sensitivity (H4, +), and data security (H5, −) shape privacy concerns; psychological ownership reduces sharing of personal driving data (H1, −), with privacy concerns mediating this effect (H2); the incentive design factors self-efficacy enhancement (H6a) and self-image congruency (H6b) moderate the ownership–sharing relationship.]

Figure 2. Research Model

We asked participants to imagine a situation in which they were approached by the manufacturer of their car with a request for real-time driving data on braking, acceleration, and speed, which would be used in an internal research project (full scenario description in the appendix). The driver retained the right to opt out at any time. No compensation was offered in return for data sharing. After reading the scenario description, participants filled out a set of survey questions.

Whenever possible, we adapted measures from previously validated multi-item scales to ensure adequate reliability and validity. Drivers’ willingness to share driving data served as a proxy for actual data sharing and was measured by a single item linked to the specific connected car scenario (“How high is your willingness to participate in the research project and let your car transmit the aforementioned driving data to your car manufacturer?” measured on a 7-point Likert-type scale where 1 = very low and 7 = very high).5 To capture privacy concerns, we drew upon the widely employed concern for information privacy scale (CFIP) developed by Smith et al. (1996) and followed the common procedure of adapting the wording of the items to the study context (e.g., Xu et al. 2012). CFIP is conceptualized as a reflective second-order factor comprising four first-order components: collection of personal information, unauthorized secondary use of personal information, errors in personal information, and improper access to personal information. Agreement to statements was measured on a 7-point scale (where 1 = strongly disagree and 7 = strongly agree). The construct of psychological ownership originally formulated by Pierce et al. (2001) was captured by a well-established four-item measure employed in previous studies (Anderson and Agarwal 2010; Peck and Shu 2009). Table A4 in the Appendix lists all measurement items and their respective sources. In addition to the core constructs, the survey also captured relevant control variables, including age, gender, education, higher education, car usage (in km/year), and whether the participant is an active user of online social networks (OSN usage).

Analysis: We used covariance-based SEM techniques, as implemented in AMOS 25, to test the proposed relationships. Following a two-step approach, we first assessed the quality of our measures through the measurement model and then tested hypotheses through the structural model. Maximum likelihood estimations were employed for the model assessment. To establish mediation, we followed the procedure proposed by Preacher and Hayes (2004) and examined the significance of the indirect effect with bootstrapped data.

5 A single-item measure is appropriate here, as the concept pertains to a single concrete object (specified set of driving data) and a single concrete attribute (willingness to share data) (Bergkvist and Rossiter 2007).
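The bootstrapped indirect-effect test described in the Analysis paragraph can be sketched compactly. The following illustrative re-implementation runs on simulated data and uses a plain percentile interval (the study reports bias-corrected intervals); all variable names are ours, not the authors’:

```python
import numpy as np

def bootstrap_indirect_effect(x, m, y, n_boot=5000, seed=0):
    """Percentile-bootstrap test of the indirect effect x -> m -> y.

    The indirect effect is a*b, where a is the slope of m on x and b is
    the slope of y on m controlling for x (Preacher & Hayes 2004).
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # resample cases with replacement
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]         # a-path: mediator on predictor
        X = np.column_stack([np.ones(n), mb, xb])
        b = np.linalg.lstsq(X, yb, rcond=None)[0][1]  # b-path, controlling for x
        est.append(a * b)
    lo, hi = np.percentile(est, [2.5, 97.5])
    return np.mean(est), (lo, hi)

# Toy data mimicking the mediation pattern: higher psychological
# ownership (po) -> higher privacy concerns (pc) -> lower willingness.
rng = np.random.default_rng(1)
po = rng.normal(size=500)
pc = 0.5 * po + rng.normal(size=500)
will = -0.6 * pc - 0.25 * po + rng.normal(size=500)
effect, ci = bootstrap_indirect_effect(po, pc, will)
print(effect, ci)
```

An indirect effect is judged significant when the bootstrap interval excludes zero, which mirrors the decision rule applied in Study 2.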


Table 3. SEM Model Explaining Differences in Drivers’ Willingness to Share Personal Driving Data (Study 2)

                                                                 Model 1              Model 2
PO → Willingness to share driving data                           -0.568*** (0.066)    -0.253*** (0.088)
PO → PC                                                          –                    0.505*** (0.050)
PC → Willingness to share driving data                           –                    -0.625*** (0.122)
PO → PC → Willingness to share driving data (indirect effect)    –                    -0.316*** (0.077)

Note: n = 333; PO = psychological ownership; PC = privacy concerns; controls = age, gender, education, higher education, car usage (in km/year), OSN usage; estimates are unstandardized coefficients. Standard errors in parentheses. *p < 0.10; **p < 0.05; ***p < 0.01.
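As a quick arithmetic consistency check (ours, not part of the original analysis), the indirect effect in Table 3 equals the product of the two constituent paths, and the direct plus indirect effects approximately recover the Model 1 total effect:

```python
a = 0.505      # PO -> PC (Table 3, Model 2)
b = -0.625     # PC -> willingness to share (Table 3, Model 2)
indirect = a * b
print(round(indirect, 3))           # -0.316, the reported indirect effect
direct = -0.253                     # direct PO effect remaining in Model 2
print(round(direct + indirect, 3))  # close to the Model 1 total of -0.568
```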

Findings and Discussion

Results of a confirmatory factor analysis (CFA) supported the convergent and discriminant validity of all multi-item scales. The reliability of all scales exceeded the generally accepted threshold of a Cronbach’s alpha of 0.7 (Nunnally 1978). Model fit indices suggested that the proposed factor structure has a reasonably good fit with the data (CFI = 0.99, TLI = 0.99, SRMR = 0.04, and RMSEA = 0.04) (Hu and Bentler 1999). The unstandardized regression weights of the structural model provided support for the hypothesized relationships (Table 3). Consistent with the theoretical arguments, we found a statistically significant negative effect of psychological ownership on individuals’ willingness to share driving data (Model 1: B = -0.568, p < 0.01). Hypothesis 1 was thus supported.

To test whether this effect is mediated by a third variable, as expressed in Hypothesis 2, we estimated indirect effects. In this regard, bootstrap analyses with 5000 replications revealed that the negative effect of psychological ownership on willingness to share is indeed mediated by privacy concerns, as indicated by a significant indirect effect (Model 2: B = -0.316, 95% BC CI [-0.480, -0.179]). Thus, Hypothesis 2 was supported. The weakened but detectable direct effect of psychological ownership on willingness to share (Model 2: B = -0.253, p < 0.01) suggests a complementary mediation (Zhao et al. 2010). We replicated the empirical findings in support of H1 and H2 across two additional independent samples. One sample (n1 = 131) was composed of active users of Facebook, the other of technically educated employees of a major German car manufacturer (n2 = 730). In the light of arguments that, for example, digital natives or individuals with technical knowledge might systematically differ in their disclosure decision-making (Dinev and Hart 2006; Marwick et al. 2010), the successful replication of our findings increased our confidence in their validity.

After establishing the basic mediation of our proposed research model, we designed Study 3 to examine the role of incentives and contextual privacy risk factors in individuals’ sharing decisions. As such, Study 3 benefits from and builds upon Study 2, which explicated the fundamental interplay between psychological ownership, privacy concerns, and data sharing, in order to establish a more comprehensive model of individuals’ sharing decisions.

Study 3: Privacy Risks and Service Design in Data Sharing Decisions

Methods

Design: In Study 3, we examined how contextual privacy risk factors affect individuals’ privacy concerns (Hypotheses 3-5) and the extent to which incentive design can mitigate the protective tendencies associated with strong feelings of ownership (Hypotheses 6a and 6b: moderation). To test these extensions of our previously established mediation model, we conducted a field study designed to observe drivers’ actual data sharing decisions in the presence of an incentive.

We conducted a survey-based pilot study to inform the design of our field experiment. We surveyed car drivers in Germany (n = 346) to identify incentive mechanisms that would mitigate the negative effect of psychological ownership on drivers’ data sharing decisions. Consistent with our theoretical perspective, only those incentives that appealed to the motives of psychological ownership neutralized the negative effect of psychological ownership. This was especially true of the data-enabled service of real-time driving analysis (digital driving assistant), which can potentially enhance both one’s self-identity as a skilled driver and one’s perceived self-efficacy of being able to control the car at all times. To translate these


insights from our survey-based pilot study into our field study, we established a research collaboration with TomTom Telematics, a leading automobile device company, to obtain access to a retrofittable plugin telematics device known as the TomTom Curfer along with its associated mobile app.

Procedure: In the first step, through an online survey, we identified 324 individuals who met our selection criteria of (1) owning or having regular access to a car built in 2004 or later and (2) having a smartphone with a mobile data plan. Our final sample (age: M = 32.2 years, SD = 15.0; gender: 43% female; frequent driving: 80%; legal ownership of car: 58%; online social network users: 76%) contained car drivers who were all capable of making use of the telematics service we offered from a technical viewpoint. In the second step, the 324 selected participants were asked to provide information on key constructs described below and to decide whether or not to enroll as a test driver and share driving data through the TomTom Curfer with real-time driving analysis capabilities. The device they would then receive is easy to install and transmits personal driving data6 to TomTom, the service provider. In return for agreeing to share their driving data, participants obtained full access to all functionalities of the digital driving assistant—real-time feedback on acceleration, cornering and braking, insights into car performance, trip review, and driving style badges—through the TomTom Curfer smartphone app. To enable informed decisions, we provided individuals with background information on the underlying technology and the associated data flows.

Our dependent variable, sharing of driving data, constitutes a discrete choice and was hence measured by a binary variable. This variable takes the value of “1” if a participant agreed to share her driving data and physically installed the telematics device in her car and of “0” otherwise. As such, Study 3 is one of the first in the field of privacy research to go beyond intentions to share and has captured actual data sharing decisions in a field setting.

To measure privacy concerns associated with the digital driving assistant, we relied on the same scale as in Study 2 (i.e., CFIP, again modeled as a reflective second-order factor). We also used the same instrument to measure psychological ownership toward the requested personal driving data. We captured participants’ perception of the service provider’s trustworthiness (relational trust) with a four-item scale (Tax et al. 1998) with 7-point Likert-type response options (1 = strongly disagree and 7 = strongly agree). We asked participants to rate the sensitivity of the requested data (data sensitivity) on a scale of 1 (not sensitive) to 7 (very sensitive) (Kehr et al. 2015). To assess how much the driving analytic service that was offered appealed to participants’ self-identity, we used the self-image congruency scale developed by Sirgy et al. (1997). Self-efficacy enhancement referred to how much the drivers believed that using the digital driving assistant would improve their driving ability; our measurement here was a reformulated scale that originally assessed how well a specific website supports an individual in online shopping (Childers et al. 2001). In the absence of suitable established scales for individuals’ perceptions of data security that are sufficiently distinct from privacy concerns, we developed a new instrument that captures the extent to which individuals consider their personal data in a specific setting to be at risk of being corrupted, manipulated, deleted, or accessed without authorization during data collection, storage, or transfer. As part of the scale development process, we generated an item pool based on prior data security literature. We tested this item pool among a student sample (n = 159; detailed results are available from the authors upon request). Based on results from exploratory factor analyses, we eliminated items with low loadings and retained four items (Cronbach’s alpha = 0.83).

We also captured relevant control variables pertaining to personal characteristics including age, gender, education, higher education, and active use of online social networks (OSN usage). Additionally, we captured information on whether an individual used his or her car frequently (frequent driver = more than two rides per week) and exclusively (exclusive driver).

Analysis: Given the binary nature of our dependent variable, we used Bayesian structural equation modeling (BSEM) as implemented in IBM SPSS Amos 25 to test our hypotheses (Byrne 2016). In contrast to the frequentist approach, the Bayesian method views parameters (e.g., the mean) as variables instead of constants. This requires making fewer asymptotic assumptions than, for example, maximum likelihood estimation (Muthén and Asparouhov 2012). We set noninformative priors and used the Markov chain Monte Carlo (MCMC) algorithm to draw random values of parameters from a high-dimensional joint posterior distribution (Gelman et al. 2004). To test our moderating hypotheses, we specified latent interactions in our model using the matched pairs approach (Marsh et al. 2004).

6 The requested data comprised “Information related to your vehicle and how you drive it such as time of start/stop, idling time, speed and speed distribution, distance driven, braking, cornering, acceleration, battery levels and status, engine load, engine throttle levels,” together with “Information that identifies you, your vehicle, your OBD2 dongle, and your smartphone.”
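Cronbach’s alpha, the reliability criterion applied to the retained four-item data security scale, can be computed directly from an item-score matrix. A minimal sketch on simulated single-factor data (not the study’s actual item responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of scale totals
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Simulated responses to a 4-item scale driven by one latent factor,
# loosely mimicking a unidimensional instrument like the one described.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
responses = latent + 0.6 * rng.normal(size=(200, 4))
alpha = cronbach_alpha(responses)
print(round(alpha, 2))
```

Values above the conventional 0.7 threshold (Nunnally 1978) indicate acceptable internal consistency.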


Table 4. Bayesian SEM Model Explaining Data Sharing Decisions (Study 3)

                                                         Model 1               Model 2
Trust → PC (H3)                                          -0.243 (0.070)***     -0.249 (0.071)***
Data Sensitivity → PC (H4)                               0.190 (0.043)***      0.192 (0.043)***
Data Security → PC (H5)                                  -0.505 (0.064)***     -0.501 (0.064)***
PO → Data Sharing                                        0.243 (0.182) n.s.    0.223 (0.177) n.s.
PC → Data Sharing                                        -0.819 (0.271)***     -0.754 (0.274)***
Self-efficacy enhancement → Data Sharing                 0.583 (0.164)***      –
Self-efficacy enhancement × PO → Data Sharing (H6a)      0.649 (0.303)**       –
Self-image congruency → Data Sharing                     –                     0.772 (0.299)***
Self-image congruency × PO → Data Sharing (H6b)          –                     0.201 (0.175) n.s.

Note: n = 324; PC = privacy concerns; PO = psychological ownership; controls = age, gender, education, higher education, OSN usage, frequent driver, exclusive driver. Estimates are unstandardized coefficients. In lieu of standard errors, the Bayesian estimation procedure in SPSS Amos provides “posterior standard deviation” (in parentheses). n.s. = not significant.
* 90% posterior interval does not contain zero (i.e., 95% of posterior draws are positive or negative).
** 95% posterior interval does not contain zero (i.e., 97.5% of posterior draws are positive or negative).
*** 99% posterior interval does not contain zero (i.e., 99.5% of posterior draws are positive or negative).
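The significance convention in Table 4 (whether a central posterior interval contains zero) is straightforward to apply to raw draws. This sketch approximates two posteriors as normal distributions using the reported Model 1 medians and posterior standard deviations; the actual check would operate on the MCMC chains themselves:

```python
import numpy as np

def interval_excludes_zero(draws, level=0.95):
    """True if the central posterior interval at `level` excludes zero."""
    tail = (1 - level) / 2 * 100
    lo, hi = np.percentile(draws, [tail, 100 - tail])
    return bool(lo > 0 or hi < 0)

# Simulated posterior draws (normal approximation, illustrative only):
# one coefficient clearly negative, one straddling zero.
rng = np.random.default_rng(42)
pc_draws = rng.normal(-0.819, 0.271, size=50_000)  # PC -> Data Sharing
po_draws = rng.normal(0.243, 0.182, size=50_000)   # PO -> Data Sharing
print(interval_excludes_zero(pc_draws, 0.99))  # True  -> reported as ***
print(interval_excludes_zero(po_draws, 0.95))  # False -> reported as n.s.
```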

Findings and Discussion

A CFA supported the convergent validity for all multi-item measures (see Table A1 in the Appendix). The discriminant validity of our measurements is supported in all cases except one. In the case of self-image congruency and self-efficacy enhancement, the square root of the average variance extracted (AVE) exhibits a slightly lower value than the interconstruct correlation. Testing the hypothesized moderators by estimating one model only is hence not feasible (Hair et al. 2010). The reliability of all scales exceeded a Cronbach’s alpha of 0.7 (Nunnally 1978). Model fit indices suggest that the proposed factor structure has a good fit with the data (CFI = 0.98, TLI = 0.97, SRMR = 0.04, and RMSEA = 0.05) (Hu and Bentler 1999).

Table 4 summarizes the median estimates of the structural relationships and indicates whether their 90%, 95%, and 99% posterior intervals contain zero. The results for both models were obtained with 500 burn-in samples and 50,000 iterations. Relying on the measure suggested by Gelman et al. (2004), our analysis exhibited a sufficient convergence of the chains, with final values reaching 1.0014 (Model 1) and 1.0018 (Model 2).

Consistent with the theoretical arguments, Bayesian estimates indicated that perceptions of the trustworthiness of the requesting stakeholder—the service provider—were negatively related to individuals’ privacy concerns (Model 1: B = -0.243; Model 2: B = -0.249; 99.5% of posterior draws are negative). Thus, Hypothesis 3 was supported. We further found that the more sensitive the requested data were, the higher the individuals’ privacy concerns tended to be (Model 1: B = 0.190; Model 2: B = 0.192; 99.5% of posterior draws are positive). Thus, Hypothesis 4 was supported. Regarding our third contextual privacy risk factor, we found that an individual’s perceptions of data security significantly affected his or her privacy concerns. The lower perceived data security was, the higher the privacy concerns reported by the study participants (Model 1: B = -0.505; Model 2: B = -0.501; 99.5% of posterior draws are negative). Thus, Hypothesis 5 was supported.

Among the study participants, 24 (7.4%) agreed to install the telematics device in their car and transmitted verifiable personal driving data to the service provider. The results of the BSEM explaining such sharing of driving data are consistent with findings from Study 2 on sharing intentions. They indicate that privacy concerns significantly reduce the likelihood that a driver will decide to share personal driving data (Model 1: B = -0.819; Model 2: B = -0.754; 99.5% of posterior draws are negative).

Our moderation Hypotheses 6a and 6b suggested that incentives—in our case, digital driving assistants—that enhance drivers’ self-efficacy or are congruent with their


self-image may contain the negative effect of psychological ownership on data sharing. High correlation between self-efficacy enhancement and self-image congruency (r = 0.82) and VIFs above commonly recommended thresholds suggested that results would be more reliable if the hypothesized moderating effects of both variables were tested separately (see Model 1 and Model 2) (Hair et al. 1995). The coefficient for the interaction term of self-efficacy enhancement and psychological ownership was positive and significant (Model 1: B = 0.649; 97.5% of posterior draws are positive), lending support for Hypothesis 6a. The nonsignificant coefficient for self-image congruency × psychological ownership (Model 2: B = 0.201; 95% of posterior draws are positive and negative), on the other hand, does not indicate a moderating effect. Hence, Hypothesis 6a was supported and Hypothesis 6b was not supported.7 This might result from the fact that the design of the digital driving assistant was more effective at enhancing drivers’ self-efficacy in the driving task (M = 3.65, SD = 1.54) than at facilitating the expression of their self-identity (M = 2.85, SD = 1.46). One possible explanation for the rather low average ratings of the digital driving style assistant’s level of self-image congruency lies in the design of its user interface. It could be described as rather neutral in the sense that it did not contain visual cues that would evoke associations with appealing themes in the context of personal mobility (e.g., a helmet for motorsport or a leaf for environmental awareness). Apart from this, Carter and Grover (2015) suggest that an IS artifact, in general, will be viewed as integral to a person’s sense of self when it is woven into personal and social routines, which might apply less to the digital driving assistant employed in this study.

General Discussion

The IoT, in general, and connected cars, in particular, are important and rapidly expanding contexts in which the physical and the virtual converge. This introduces novel privacy challenges that are both increasingly prevalent and still largely unexplored. We contribute to a deeper understanding of data sharing decisions and possible managerial options to enhance data sharing for service innovation. The model of data sharing decisions that we developed and validated in three complementary studies provides new insights into the nature and interplay of […]

Implications for Research

Our findings contribute to the interdisciplinary field of privacy research with contextual, theoretical, and methodological extensions of the literature on individual data sharing.

First, regarding our connected car context, we respond to calls for extending the boundaries of privacy research by exploring novel empirical settings (Smith et al. 2011). The four characteristics we identified as determining the nature of the privacy threats that users are exposed to while interacting with IoT devices (see section “Privacy in the IoT context”) underscore the need for dedicated privacy research—both empirical and conceptual—in this domain (Lowry et al. 2017; Shim et al. 2020). These characteristics exemplify both similarities to and differences from what has been detected in other settings. For example, our findings point out that individuals’ privacy concerns in the domain of IoT are triggered not only by virtual risks but also by physical ones like exposure to burglars, manipulation of vehicle functions, and car hijacking through hacker attacks. This amalgamation of the virtual and the physical in the IoT points to the need to treat informational and physical privacy as interdependent rather than independent concepts (Smith et al. 2011). Similarly, the technological developments associated with the IoT increase the potential for cross-fertilization between research on behavioral aspects of privacy and technical aspects of security (Shim et al. 2020). Our Studies 2 and 3 provide quantitative evidence for the link between privacy and cybersecurity in the connected car context, with privacy concerns increasing as data security decreased. We also argue that data sharing and device usability are uniquely intertwined in the IoT. Typically, IoT users only benefit from the value proposition if they engage in unconstrained data sharing despite the virtual and physical risks as well as the psychological costs associated with it. This dilemma makes the question of appropriate compensation and incentive mechanisms for users, as well as our theory and evidence in this regard, particularly relevant in the IoT. We argue that such insights extracted from connected cars might well generalize to the broader class of IoT devices.

Second, with regard to theory, our research makes several additions to extant conceptual models of data sharing decisions that may well enhance the explanatory power and relevance of privacy research against the emerging realities of the IoT. This is of particular value in light of the growing criticism of the privacy calculus perspective and its focus on
psychological ownership, privacy concerns, incentive privacy concerns as the predominant, if not the only,
design, and data sharing in the context of connected cars. perceived cost assumed to affect individuals’ sharing
decisions (Acquisti et al. 2015; Dinev et al. 2015; Keith et al.

7
We also tested all hypotheses using standard logistic regression techniques conclusions with regard to all hypotheses. Results are available from the
and corresponding moderation analyses. This yielded qualitatively identical authors upon request.

MIS Quarterly Vol. 45 No. 4 / December 2021 1881


Cichy et al. / Privacy Concerns and Data Sharing in the Internet of Things

2012). Indeed, individuals are often reluctant to share their data even under conditions of anonymity, when privacy concerns are unlikely to be at play (Chellappa and Sin 2005; Coy 2001). Hence, extant privacy-centered models are unable to sufficiently explain the sharing of anonymized and seemingly low-sensitivity data, the kind of data that is of practical relevance to service innovation in the IoT. In this regard, our qualitative analyses from Study 1 suggest that perceptions about the costs of data sharing are more diverse than previously assumed. Similar to previous findings, we show that individuals exhibit strong feelings of ownership not only with regard to physical IT artifacts (like a computer, as in Anderson and Agarwal 2010), but also in terms of the data that those artifacts process (like social network profile information, as in Spiekermann et al. 2012). The sharing of personal data—even if less sensitive—might thus be perceived as a loss of psychological ownership that threatens individuals’ emotional attachment to their data. Consistent with these arguments, our quantitative findings reveal that feelings of ownership have important behavioral implications in the context of data sharing decisions—especially when legal ownership is ambiguous, as is still the case in many IoT domains (Kohler and Colbert-Taylor 2014; Zmud et al. 2016). Our results also indicate that a deeper understanding of the motives underpinning psychological ownership can inform the design of effective incentives to compensate individuals for sharing their data. Incentives in the form of data-enabled services can be designed so that they appeal to the fundamental motives that underpin psychological ownership (e.g., self-efficacy), letting the act of data sharing enhance rather than threaten this desired emotional state.

Third, regarding methods, our study illustrates the complementary nature of qualitative and quantitative research methods. Qualitative methods provide context-specific insights into salient factors that influence key behaviors or outcomes, which can then be measured through quantitative methods. For instance, without the context-specific understanding gained through qualitative analysis, we may not have identified psychological ownership as a main determinant of individuals’ willingness to share driving data. Our mixed methods approach and the opportunities for methodological triangulation it provides, however, also put us in the position to contribute to two methodological debates among privacy scholars. The first debate pertains to the potential biases induced by survey studies, the predominant approach in privacy research. There is growing awareness among privacy scholars that participants’ exposure to a privacy concern scale might trigger such concern in the first place, given the emotional and cognitive processes it initiates (Alashoor et al. 2017). The established multidimensional scales (e.g., Smith et al. 1996; Malhotra et al. 2004) might draw participants’ attention to areas of personal data misuse that they might not have thought about before (e.g., excessive collection or unauthorized access to data). That said, our qualitative and quantitative studies converge in highlighting the significance of privacy concerns regarding connected cars. Perhaps most importantly, 41% of participants in Study 1 associated connected cars with a threat to their privacy. The strong privacy concerns observed across our three studies are, hence, not a methodological artifact of the priming effect of survey instruments. Our work also has implications for a second methodological debate that encourages privacy researchers to advance from explaining data sharing intentions to observing actual data sharing behavior (e.g., Belanger and Crossler 2011; Dinev et al. 2015; Smith et al. 2011). As the first attempt to examine both outcome variables in the same empirical context—data sharing intentions in Study 2 and data sharing behavior in Study 3—we found that our results converge across the two outcome variables.


Implications for Practice and Policy

Our research also contributes to the ongoing public discussion on how to balance organizations’ interests in collecting and monetizing personal data with users’ interests in controlling and benefitting from that data. Several IS scholars have called for research to advance the understanding of privacy and cybersecurity by addressing these controversial and unresolved issues (Lowry et al. 2017) rather than trying to fill gaps in the literature without making any practically relevant contributions (Rai 2017). Indeed, engaged IS scholarship has an important role to play, not only in developing a better understanding of the perceived and actual vulnerabilities of IoT users, but also in deriving countermeasures and incentive mechanisms that enable value creation in IoT ecosystems. As such, our work helps address a practical issue that ranks among the most persistent barriers to the adoption of connected car services, such as telematic insurance, digital driving assistants, and predictive maintenance (Foley 2017; Jean 2020).

The contextual privacy risk factors that we uncovered offer multiple points of departure for managerial action. As our findings indicate, building relational trust with users helps to alleviate their privacy concerns and ultimately enhances data sharing, the central precondition for connected car services. Data-requesting stakeholders could signal their integrity and reduce perceived risks of data sharing through transparent data usage policies, user-friendly privacy settings, and external guarantees of privacy through privacy seals provided by third-party organizations (Rifon et al. 2005; Xu et al. 2011). Moreover, car manufacturers and technology providers would be well-advised to take seriously drivers’ beliefs about security, which are closely intertwined with privacy concerns in IoT-related contexts, as our study shows. This will involve embedding resilient security systems into IoT devices,


informing users of any residual security risks, and educating them on how best to contain them. For the latter two, the negative consequences revealed in our interviews represent possible starting points for better communication with drivers. In addition, policy makers could implement certifications that stipulate a sufficient level of data security as a precondition for the launch of an IoT product. Our analyses of contextual privacy risk factors also revealed that privacy concerns—and subsequently data sharing—varied notably across the levels of perceived data sensitivity. Hence, rather than simply depriving car drivers of their privacy and maximizing the collection and use of connected car data, service providers may adopt more selective and tailored approaches to design data-enabled services that stand out from the competition by offering the best possible ratio of customer benefit to data consumption, especially at the high-sensitivity end of the data spectrum. Alternatively, sensitive data could be processed only locally, i.e., on users’ IoT devices, without transferring the data to the cloud, hence shielding it from further central storage and analysis (see edge computing, Shi and Dustdar 2016). Our findings also underline the promise of “privacy-by-design” principles (Cavoukian 2011) when designing and evaluating solutions to create value from the data generated by connected cars and other smart and interconnected devices.

Perhaps most profoundly, our findings reveal that, despite strong feelings of ownership, drivers may be willing to share their data in exchange for appropriate compensation in the form of offers that address the motives underlying their feelings of ownership. As illustrated in our Study 3, this compensation can take the form of a digital driving assistant that helps users learn about their driving style. Rather than attempting to suppress users’ feelings of ownership toward their data, stakeholders in the connected car and broader IoT arena could partner with users in an open innovation environment to design services that appeal to drivers’ self-identity, enhance perceptions of self-efficacy, and further reinforce drivers’ psychological ownership regarding the data that their connected vehicles and other IoT devices generate. As a case in point, it would be interesting to consider how gamification elements could further enhance the experience of self-efficacy (Hamari et al. 2014) and be integrated in services such as the driving style assistant employed in our Study 3. We believe that psychological ownership will gain in relevance as policy makers seek to empower users and provide them with greater control over the collection, storage, and use of their data. The 2018 reform of the European Union’s (EU) General Data Protection Regulation (GDPR) has empowered users and strengthened their bargaining position in the emerging data economy. Overall, we hope that our findings will help practitioners and policy makers better understand drivers’ concerns about sharing driving data, placing organizations in a better position to recognize their digital responsibility and design technologies, services, and policies centered on users’ privacy and ownership needs.


Limitations and Future Research Directions

Our study has theoretical and empirical limitations that offer several meaningful opportunities for future research. First, as with all self-reported data, our research is exposed to a possible common method bias, particularly in our Study 2, which uses data sharing intentions as a dependent variable. We sought to reduce this threat ex ante by implementing procedural remedies recommended by Podsakoff et al. (2003), including guaranteeing the anonymity of survey participants, providing contextual information and definitions to reduce ambiguity, and informing participants that there were no right or wrong answers. Moreover, results from Harman’s single-factor test, the unmeasured latent method construct approach (Liang et al. 2007), and the specific bias test (Serrano Archimi et al. 2018) jointly suggested that common method bias was unlikely to affect the results of Study 2. Future research could go a step further and measure data sharing intentions at a later point in time using a different instrument.

Second, the design of our field study, Study 3, may have limited our ability to detect the moderating effect associated with increased self-image congruency (Hypothesis 6b). As outlined above, it might well be that the digital driving assistant did not allow for a meaningful expression of drivers’ self-identity, given its limited integration into drivers’ daily life (Carter and Grover 2015). We encourage IS scholars to refine our study design and further examine the effect that incentives with high self-image congruency could have in terms of redirecting the effect of psychological ownership on data sharing.

Third, our empirical focus on the data sharing decisions of car drivers in Germany, who are known to be particularly hesitant to share data with connected cars (McKinsey 2016), allowed for local theorizing and interpretation (Chiasson and Davidson 2005) but might constrain the generalizability of our empirical findings (Seddon and Scheepers 2011). Future replication studies could seek to establish the statistical generalizability of our empirical findings to other geographical settings that differ in terms of cultural beliefs, prevalent attitudes toward possession, and privacy norms (Dinev et al. 2006; Lowry et al. 2011). However, we expect our theoretical model to be generalizable to a broad set of smart, connected devices in the IoT that generate continuous streams of data as by-products of everyday use (Porter and Heppelmann 2014).

Lastly, mixed methods research is associated with additional sources of bias that pose a threat to the inference quality of such study designs. One of these limitations results from making statistical generalizations from a sample population to a larger population (Onwuegbuzie and Johnson 2006). It is


possible that our interview participants and the participants recruited for our quantitative Studies 2 and 3 were not representative of the population. This would limit the transferability of findings between studies and would affect the meta-inferences that emerge from our mixed methods design. However, the concerns associated with the integration of multiple samples are mitigated as our data are from a relatively large number of interviewees (n=120) and survey respondents (>300). Study participants were recruited through the same channels and exhibited similar sociodemographic characteristics across all our studies. A second potential limitation of our mixed methods design pertains to a possible bias that originates from the sequence in which our studies were conducted (Onwuegbuzie and Johnson 2006). It is possible that some of our findings are dependent on the order in which the studies were administered and that our interpretation could be different if the order of the quantitative and qualitative phases had been reversed.


Conclusion

The connected car represents a critical stepping stone toward the vision of a fully interconnected and enriched experience of personal mobility. As a significant IS artifact in consumer IoT, the connected car simultaneously presents substantial opportunities for data-enabled value creation and unprecedented threats to drivers’ informational and physical privacy. This calls for refined models of data sharing that account for the emerging realities of the connected car and the broader IoT. Against this backdrop, we conducted three complementary studies to develop and validate a conceptual model based on the interplay of psychological ownership, privacy concerns, and data sharing. We found evidence that drivers exhibit feelings of data ownership that reduced their data sharing propensity, both directly and indirectly, because of heightened concerns about privacy. Privacy concerns, in turn, were shaped by relational trust toward the data-requesting party, perceptions of data security, and the perceived sensitivity of the data requested. We demonstrated that drivers with high psychological ownership were willing to share their driving data if they were offered adequate compensation in exchange. We found that incentives that appealed to the motives of psychological ownership (enhancing self-efficacy or contributing to self-identity) alleviated the negative effect that psychological ownership had on individuals’ decisions to share personal data. Our conceptual model and empirical findings promise to promote data sharing with adequate compensation of drivers and to encourage future privacy research in the IoT context. Only then will privacy research be able to keep up with the technological advances and contribute to sustained value creation in the IoT era.


Acknowledgments

We are grateful to the senior editor, Susan Brown, the associate editor, and three anonymous reviewers for their valuable feedback and guidance throughout the review process. We thank participants of the 35th International Conference on Information Systems, the 2015 Innovation Research Seminar in Aachen, and participants of research seminars at the University of Melbourne and University of Wollongong for their helpful comments and suggestions on earlier versions of this paper. We also thank Rodrigo Barboza and team for the cooperation, inspiring discussion, and technical support. Finally, we acknowledge the excellent research assistance of Edda Glase.


References

Acquisti, A., Taylor, C. R., and Wagman, L. 2016. “The Economics of Privacy,” Journal of Economic Literature (54:2), pp. 442-492.
Acquisti, A., Brandimarte, L., and Loewenstein, G. 2015. “Privacy and Human Behavior in the Age of Information,” Science (347:6221), pp. 509-514.
Acquisti, A., John, L. K., and Loewenstein, G. 2012. “The Impact of Relative Standards on the Propensity to Disclose,” Journal of Marketing Research (49:2), pp. 160-174.
Adjerid, I., Acquisti, A., and Loewenstein, G. 2019. “Choice Architecture, Framing, and Cascaded Privacy Choices,” Management Science (65:5), pp. 2267-2290.
Agarwal, R., Gao, G., DesRoches, C., and Jha, A. 2010. “Research Commentary—The Digital Transformation of Healthcare: Current Status and the Road Ahead,” Information Systems Research (21:4), pp. 796-809.
Ågerfalk, P. J. 2013. “Embracing Diversity Through Mixed Methods Research,” European Journal of Information Systems (22:3), pp. 251-256.
Ajzen, I. 1991. “The Theory of Planned Behavior,” Organizational Behavior and Human Decision Processes (50:2), pp. 179-211.
Alashoor, T., Fox, G., and Smith, H. J. 2017. “The Priming Effect of Prominent IS Privacy Concerns Scales on Disclosure Outcomes: An Empirical Examination,” in Proceedings of the Pre-ICIS Workshop on Information Security and Privacy, Seoul, Korea.
Altman, I. 1975. The Environment and Social Behavior: Privacy, Personal Space, Territory, and Crowding, Brooks/Cole Publishing.
Anderson, C. L., and Agarwal, R. 2010. “Practicing Safe Computing: A Multimedia Empirical Examination of Home Computer User Security Behavioral Intentions,” MIS Quarterly (34:3), pp. 613-643.
Anderson, C. L., and Agarwal, R. 2011. “The Digitization of Healthcare: Boundary Risks, Emotion, and Consumer Willingness to Disclose Personal Health Information,” Information Systems Research (22:3), pp. 469-490.
Awad, N. F., and Krishnan, M. S. 2006. “The Personalization Privacy Paradox: An Empirical Evaluation of Information Transparency and the Willingness to be Profiled Online for Personalization,” MIS Quarterly (30:1), pp. 13-28.
Bahirat, P., He, Y., Menon, A., and Knijnenburg, B. 2018. “A Data-Driven Approach to Developing IoT Privacy-Setting Interfaces,”


in Proceedings of the 23rd International Conference on Intelligent User Interfaces.
Bansal, G., Zahedi, F. M., and Gefen, D. 2010. “The Impact of Personal Dispositions on Information Sensitivity, Privacy Concern and Trust in Disclosing Health Information Online,” Decision Support Systems (49), pp. 138-150.
Bartunek, J. M. 1993. “Rummaging Behind the Scenes of Organizational Change and Finding Role Transitions, Illness, and Physical Space,” Research in Organizational Change and Development (7:1), pp. 41-76.
Basi, R. K. 1999. “WWW Response Rates to Socio-Demographic Items,” Journal of the Market Research Society (41:4), pp. 397-401.
Belanger, F., and Crossler, R. E. 2011. “Privacy in the Digital Age: A Review of Information Privacy Research in Information Systems,” MIS Quarterly (35:4), pp. 1017-1036.
Belanger, F., Hiller, J. S., and Smith, W. J. 2002. “Trustworthiness in Electronic Commerce: The Role of Privacy, Security, and Site Attributes,” Journal of Strategic Information Systems (11:3-4), pp. 245-270.
Belk, R. W. 2010. “Sharing,” Journal of Consumer Research (36:5), pp. 715-734.
Belk, R. W. 1988. “Possessions and the Extended Self,” Journal of Consumer Research (15:2), pp. 139-168.
Bergkvist, L., and Rossiter, J. R. 2007. “The Predictive Validity of Multiple-Item Versus Single-Item Measures of the Same Constructs,” Journal of Marketing Research (44:2), pp. 175-184.
Beverungen, D., Müller, O., Matzner, M., Mendling, J., and vom Brocke, J. 2019. “Conceptualizing Smart Service Systems,” Electronic Markets (29:1), pp. 7-18.
Brown, G., Lawrence, T. B., and Robinson, S. L. 2005. “Territoriality in Organizations,” Academy of Management Review (30:3), pp. 577-594.
Byrne, B. M. 2016. Structural Equation Modeling With AMOS, 3rd ed., Routledge.
Carter, M., and Grover, V. 2015. “Me, My Self, and I(T): Conceptualizing Information Technology (IT) Identity and Its Implications,” MIS Quarterly (39:4), pp. 931-957.
Cavoukian, A. 2011. “Privacy by Design in Law, Policy and Practice: A White Paper for Regulators, Decision-makers and Policy-makers,” Information and Privacy Commissioner, Ontario, Canada (http://www.ontla.on.ca/library/repository/mon/25008/312239.pdf).
Charmaz, K. 2006. Constructing Grounded Theory: A Practical Guide Through Qualitative Analysis, SAGE.
Chellappa, R. K., and Sin, R. G. 2005. “Personalization Versus Privacy: An Empirical Examination of the Online Consumer’s Dilemma,” Information Technology and Management (6:2), pp. 181-202.
Chen, R. 2013. “Living a Private Life in Public Social Networks: An Exploration of Member Self-Disclosure,” Decision Support Systems (55:3), pp. 661-668.
Chiasson, M. W., and Davidson, E. 2005. “Taking Industry Seriously in Information Systems Research,” MIS Quarterly (29:4), pp. 591-605.
Childers, T. L., Carr, C. L., Peck, J., and Carson, S. 2001. “Hedonic and Utilitarian Motivations for Online Retail Shopping Behavior,” Journal of Retailing (77:4), pp. 511-535.
Choi, H., and Levine, J. M. 2004. “Minority Influence in Work Teams: The Impact of Newcomers,” Journal of Experimental Social Psychology (40), pp. 273-280.
Coy, K. 2001. “The Current Privacy Environment: Implications for Third-Party Research,” Journal of Continuing Education in the Health Professions (21:4), pp. 203-214.
Crossler, R. E., and Bélanger, F. 2019. “Why Would I Use Location-Protective Settings on My Smartphone? Motivating Protective Behaviors and the Existence of the Privacy Knowledge-Belief Gap,” Information Systems Research (30:3), pp. 995-1006.
Culnan, M. J. 1993. “How Did They Get My Name? An Exploratory Investigation of Consumer Attitudes Toward Secondary Information Use,” MIS Quarterly (17:3), pp. 341-363.
Culnan, M. J., and Armstrong, P. K. 1999. “Information Privacy Concerns, Procedural Fairness, and Impersonal Trust: An Empirical Investigation,” Organization Science (10:1), pp. 104-115.
de Montjoye, Y.-A., Hidalgo, C. A., Verleysen, M., and Blondel, V. D. 2013. “Unique in the Crowd: The Privacy Bounds of Human Mobility,” Scientific Reports (3), pp. 1-5.
Dinev, T., McConnell, A. R., and Smith, H. J. 2015. “Research Commentary—Informing Privacy Research Through Information Systems, Psychology, and Behavioral Economics: Thinking Outside the ‘APCO’ Box,” Information Systems Research (26:4), pp. 639-655.
Dinev, T., and Hart, P. 2006. “An Extended Privacy Calculus Model for E-Commerce Transactions,” Information Systems Research (17:1), pp. 61-80.
Dinev, T., Bellotto, M., Hart, P., Russo, V., Serra, I., and Colautti, C. 2006. “Privacy Calculus Model in E-Commerce: A Study of Italy and the United States,” European Journal of Information Systems (15:4), pp. 389-402.
Dittmar, H. 1992. The Social Psychology of Material Possession: To Have Is to Be, St. Martin’s Press.
Eady, T. A. 2019. “Tesla’s Deep Learning at Scale: Using Billions of Miles to Train Neural Networks,” Towards Data Science (https://towardsdatascience.com/teslas-deep-learning-at-scale-7eed85b235d3).
Flick, U. 2014. An Introduction to Qualitative Research, 5th ed., SAGE.
Foley. 2017. “2017 Connected Cars & Autonomous Vehicles Survey” (https://www.foley.com/files/uploads/2017-Connected-Cars-Survey-Report.pdf).
Formanek, R. 1991. “Why They Collect: Collectors Reveal Their Motivations. To Have Possessions: A Handbook on Ownership and Property,” Journal of Social Behavior and Personality (6:6), pp. 275-286.
Fowler, G. A. 2020. “My Car Was in a Hit-And-Run. Then I Learned It Recorded the Whole Thing,” The Washington Post (https://www.washingtonpost.com/technology/2020/02/27/tesla-sentry-mode/).
Furby, L. 1980. “The Origins and Early Development of Possessive Behavior,” Political Psychology (2:1), pp. 30-42.
Gandy, O. 1993. The Panoptic Sort: A Political Economy of Personal Information, Westview.
Gelman, A., Carlin, J. B., Stern, H. S., and Rubin, D. B. 2004. Bayesian Data Analysis, Chapman & Hall/CRC.
Haberlandt, U., and Sester, M. 2010. “Areal Rainfall Estimation Using Moving Cars as Rain Gauges: A Modeling Study,” Hydrology and Earth System Sciences (14), pp. 1139-1151.


Hair, J. F., Jr., Anderson, R. E., Tatham, R. L., and Black, W. C. 1995. Multivariate Data Analysis, 3rd ed., Macmillan.
Hamari, J., Koivisto, J., and Sarsa, H. 2014. “Does Gamification Work? A Literature Review of Empirical Studies on Gamification,” in Proceedings of the 47th Hawaii International Conference on System Sciences, pp. 3025-3034.
Hann, I.-H., Hui, K.-L., Lee, S.-Y. T., and Png, I. P. L. 2007. “Overcoming Online Information Privacy Concerns: An Information-Processing Theory Approach,” Journal of Management Information Systems (24:2), pp. 13-42.
Hong, W., Chan, F. K. Y., Thong, J. Y. L., Chasalow, L. C., and Dhillon, G. 2014. “A Framework and Guidelines for Context-Specific Theorizing in Information Systems Research,” Information Systems Research (25:1), pp. 111-136.
Hu, L.-T., and Bentler, P. M. 1999. “Cutoff Criteria for Fit Indexes in Covariance Structure Analysis: Conventional Criteria Versus New Alternatives,” Structural Equation Modeling (6:1), pp. 1-55.
Hui, K. L., Tan, B. C. Y., and Goh, C. Y. 2006. “Online Information Disclosure: Motivators and Measurements,” ACM Transactions on Internet Technology (6:4), pp. 415-441.
Hui, K. L., Teo, H. H., and Lee, S. 2007. “The Value of Privacy Assurance: An Exploratory Field Experiment,” MIS Quarterly (31:1), pp. 19-33.
James, W. 1890. The Principles of Psychology, Holt.
Jean, A. 2020. “Otonomo-SBD Automotive European Consumer Survey Reveals Solid Interest in Connected Car Services and Limited GDPR Understanding,” Otonomo (https://otonomo.io/press_releases/otonomo-sbd-automotive-european-consumer-survey/).
Jiang, Z., Heng, C. S., and Choi, B. C. F. 2013. “Privacy Concerns and Privacy-Protective Behavior in Synchronous Online Social Interactions,” Information Systems Research (24:3), pp. 579-595.
Johns, G. 2006. “The Essential Impact of Context on Organizational Behavior,” Academy of Management Review (31:2), pp. 386-408.
Johnson, R. B., and Onwuegbuzie, A. J. 2004. “Mixed Methods Research: A Research Paradigm Whose Time Has Come,” Educational Researcher (33:7), pp. 14-26.
Karwatzki, S., Trenz, M., Tuunainen, V. K., and Veit, D. 2017. “Adverse Consequences of Access to Individuals’ Information: An Analysis of Perceptions and the Scope of Organisational Influence,” European Journal of Information Systems (26:6), pp. 688-715.
Kehr, F., Kowatsch, T., Wentzel, D., and Fleisch, E. 2015. “Blissfully Ignorant: The Effects of General Privacy Concerns, General Institutional Trust, and Affect in the Privacy Calculus,” Information Systems Journal (25:6), pp. 607-635.
Keith, M. J., Thompson, S. C., Hale, J. E., and Greer, C. 2012. “Examining the Rationality of Information Disclosure through Mobile Devices,” in Proceedings of the 33rd International Conference on Information Systems, Orlando, FL.
Kelly, G. 2019. “Compare the Privacy Practices of the Most Popular Smart Speakers with Virtual Assistants,” Common Sense Education (https://www.commonsense.org/education/articles/compare-the-privacy-practices-of-the-most-popular-smart-speakers-with-virtual-assistants).
Kim, D. J., Ferrin, D. L., and Rao, H. R. 2008. “A Trust-Based Consumer Decision-Making Model in Electronic Commerce: The Role of Trust, Perceived Risk, and Their Antecedents,” Decision Support Systems (44:2), pp. 544-564.
Kohler, W. J., and Colbert-Taylor, A. 2014. “Current Law and Potential Legal Issues Pertaining to Automated, Autonomous and Connected Vehicles,” Santa Clara High Technology Law Journal (31:1), Article 3.
Korman, A. K. 1970. “Toward an Hypothesis of Work Behavior,” Journal of Applied Psychology (54:1), pp. 31-43.
Krasnova, H., Spiekermann, S., Koroleva, K., and Hildebrand, T. 2010. “Online Social Networks: Why We Disclose,” Journal of Information Technology (25:2), pp. 109-125.
Landwehr, C., Boneh, D., Mitchell, J. C., Bellovin, S. M., Landau, S., and Lesk, M. E. 2012. “Privacy and Cybersecurity: The Next 100 Years,” Proceedings of the IEEE (100, Special Centennial Issue), pp. 1659-1673.
Liang, H., Saraf, N., Hu, Q., and Xue, Y. 2007. “Assimilation of Enterprise Systems: The Effect of Institutional Pressures and the Mediating Role of Top Management,” MIS Quarterly (31:1), pp. 59-87.
Lim, H., and Taeihagh, A. 2018. “Autonomous Vehicles for Smart and Sustainable Cities: An In-Depth Exploration of Privacy and Cybersecurity Implications,” Energies (11:5), Article 1062.
Li, Y. 2011. “Empirical Studies on Online Information Privacy Concerns: Literature Review and an Integrative Framework,” Communications of the Association for Information Systems (28), pp. 453-496.
Li, Y. 2012. “Theories in Online Information Privacy Research: A Critical Review and an Integrated Framework,” Decision Support Systems (54), pp. 471-481.
Lindell, M. K., and Whitney, D. J. 2001. “Accounting for Common Method Variance in Cross-Sectional Research Designs,” Journal of Applied Psychology (86:1), pp. 114-121.
Lowry, P. B., Dinev, T., and Willison, R. 2017. “Why Security and Privacy Research Lies at the Centre of the Information Systems (IS) Artefact: Proposing a Bold Research Agenda,” European Journal of Information Systems (26:6), pp. 546-563.
Lowry, P. B., Cao, J., and Everard, A. 2011. “Privacy Concerns Versus Desire for Interpersonal Awareness in Driving the Use of Self-Disclosure Technologies: The Case of Instant Messaging in Two Cultures,” Journal of Management Information Systems (27:4), pp. 163-200.
Luhmann, N. 1979. Trust and Power, Wiley.
Reyna, V. F. 2004. “How People Make Decisions that Involve Risk,” Current Directions in Psychological Science (13:2), pp. 60-66.
Malhotra, N. K., Kim, S., and Agarwal, R. 2004. “Internet Users’ Information Privacy Concerns (IUIPC): The Construct, the Scale, and a Causal Model,” Information Systems Research (15:4), pp. 336-355.
Margulis, S. T. 2003. “Privacy as a Social Issue and Behavioral Concept,” Journal of Social Issues (59:2), pp. 243-261.
Marsh, H. W., Wen, Z., and Hau, K.-T. 2004. “Structural Equation Models of Latent Interactions: Evaluations of Alternative Estimation Strategies and Indicator Construction,” Psychological Methods (9:3), pp. 275-300.
Marwick, A. E., Murgia-Diaz, D., and Palfrey, J. G. 2010. “Youth, Privacy and Reputation,” Harvard Public Law Working Paper (10-29) (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1588163).
McKinsey. 2016. Monetizing Car Data: New Service Business Opportunities to Create New Customer Benefits (https://www.mckinsey.com/~/media/mckinsey/industries/automotive%20and%20assembly/our%20insights/monetizing%20car%20data/monetizing-car-data.ashx).
Moon, Y. 2000. “Intimate Exchanges: Using Computers to Elicit Self-Disclosure from Consumers,” Journal of Consumer
Rai, A. 2016. “Synergies Between Big Data and Theory,” MIS Quarterly (40:2), pp. iii-ix.
Rai, A. 2017. “Avoiding Type III Errors: Formulating IS Research Problems that Matter,” MIS Quarterly (41:2), pp. iii-vii.
Research (26:4), pp. 323-339. Rifon, N. J., LaRose, R., and Choi, S. M. 2005. “Your Privacy Is
Mothersbaugh, D. L., Foxx, W. K., Beatty, S. E., and Wang, S. 2012. Sealed: Effects of Web Privacy Seals on Trust and Personal
“Disclosure Antecedents in an Online Service Context: The Role Disclosures,” Journal of Consumer Affairs (39:2), pp. 339-362.
of Sensitivity of Information,” Journal of Service Research Rosenthal, R., and Rosnow, R. L. 2006. Essentials of Behavioral
(15:1), pp. 76-98. Research: Methods and Data Analysis, Academic Internet
NHTSA 2018. NHTSA and Vehicle Cybersecurity, Publishers.
(https://www.nhtsa.gov/sites/nhtsa.dot.gov/files/documents/nhts Schau, H. J., and Gilly, M. C. 2003. “We Are What We Post? Self-
avehiclecybersecurity2016.pdf). Presentation in Personal Web Space,” Journal of Consumer
Nowak, G. J., and Phelps, J. E. 1992. “Understanding Privacy Research (30:3), pp. 385-404.
Concerns: An Assessment of Consumers’ Information-Related Seddon, P. B., and Scheepers, R. 2011. “Towards the Improved
Knowledge and Beliefs,” Journal of Direct Marketing (6:4), pp. Treatment of Generalization of Knowledge Claims in IS
28-39. Research: Drawing General Conclusions from Samples,”
Nunnally, J. C. 1978. Psychometric Theory, McGraw-Hill. European Journal of Information Systems (21:1), pp. 6-21.
Onwuegbuzie, A. J., and Johnson, R. B. 2006. “The Validity Issue in Seiberth, G., Gruendinger, W. 2018. “Data-driven Business Models
Mixed Research,” Research in the Schools (13:1), pp. 48-63. in Connected Cars, Mobility Services and Beyond,” BVDW
Parks, R., H. Xu, C.-H. Chu & P. B. Lowry. 2017. “Examining the Research No. 01/18 (https://www.bvdw.org/fileadmin/
Intended and Unintended Consequences of Organisational user_upload/20180509_bvdw_accenture_studie_datadrivenbusi
Privacy Safeguards,” European Journal of Information Systems nessmodels.pdf).
(26:1), pp. 37-65. Serrano Archimi, C., Reynaud, E., Yasin, H.M. and Bhatti, Z.A.
Peck, J., and Shu, S. B. 2009. “The Effect of Mere Touch on 2018. “How perceived corporate social responsibility affects
Perceived Ownership,” Journal of Consumer Research (36:3), employee cynicism: the mediating role of organizational trust”,
pp. 434-447. Journal of Business Ethics (151:4), pp. 907-921.
Phelps, J., Nowak, G., and Ferrell, E. 2000. “Privacy Concerns and Sheng, H., Nah, F. F.-H., and Siau, K. 2008. “An Experimental Study
Consumer Willingness to Provide Personal Information,” on Ubiquitous Commerce Adoption: Impact of Personalization
Journal of Public Policy & Marketing (19:1), pp. 27-41. and Privacy Concerns,” Journal of the Association for
Pierce, J. L., Jussila, I., and Cummings, A. 2009. “Psychological Information Systems (9:6), pp. 344-376.
Ownership Within the Job Design Context: Revision of the Job Shi, W., and Dustdar, S. 2016. “The Promise of Edge Computing,”
Characteristics Model,” Journal of Organizational Behavior Computer (49:5), pp. 78-81.
(30:4), pp. 477-496. Shim, J., Sharda, R., French, A. M., Syler, R. A., and Patten, K. P.
Pierce, J. L., Kostova, T., and Dirks, K. T. 2001. “Toward a Theory 2020. “The Internet of Things: Multi-Faceted Research
of Psychological Ownership in Organizations,” Academy of Perspectives,” Communications of the Association for
Management Review (26:2), pp. 298-310. Information Systems (46), pp. 511-536
Pierce, J. L., Kostova, T., and Dirks, K. T. 2003. “The State of Sirgy, M. J., Grewal, D., Mangleburg, T. F., Park, J., Chon, K.,
Psychological Ownership: Integrating and Extending a Century Claiborne, C. B., et al. 1997. “Assessing the Predictive Validity
of Research,” Review of General Psychology (7:1), pp. 84-107. of Two Methods of Measuring Self-image congruence,” Journal
Podsakoff, P. M., MacKenzie, S. B., Lee, J. Y., and Podsakoff, N. P. of the Academy of Marketing Science (25:3), pp. 229-241.
2003. “Common Method Biases in Behavioral Research: A Smith, H. J., Dinev, T., and Xu, H. 2011. “Information Privacy
Critical Review of the Literature and Recommended Remedies,” Research: An Interdisciplinary Review,” MIS Quarterly (35:4),
Journal of Applied Psychology (5:88), pp. 879-903. pp. 989-1015.
Porter, M. E., and Heppelmann, J. E. 2014. “How Smart, Connected Smith, H. J., Milberg, S. J., and Burke, S. J. 1996. “Information
Products Are Transforming Competition,” Harvard Business Privacy: Measuring Individuals’ Concerns about Organizational
Review (92), pp. 11-64. Practices,” MIS Quarterly (20:2), pp. 167-196.
Poudel, S. 2016. “Internet of Things: Underlying Technologies, Son, J. Y., and Kim, S. S. 2008. “Internet Users’ Information Privacy-
Interoperability, and Threats to Privacy and Security,” Berkeley Protective Responses: A Taxonomy and a Nomological Model,”
Technology Law Journal (31:2), pp. 2-27. MIS Quarterly (32:3), pp. 503-529.
Preacher, K. J., and Hayes, A. F. 2004. “SPSS and SAS Procedures Spiekermann, S., Korunovska, J., and Bauer, C. 2012. “Psychology
for Estimating Indirect Effects in Simple Mediation Models,” of Ownership and Asset Defense: Why People Value Their
Behavior Research Methods, Instruments, & Computers (36:4), Personal Information Beyond Privacy,” (http://epub.wu.ac.at/
pp. 717-731. 3630/).
Price, L., Arnould, E., and Curasi, C. F. 2000. “Older Consumers’ Srivastava, S. C., and Chandra, S. 2018. “Social presence in virtual
Disposition of Special Possessions,” Journal of Consumer world collaboration: An uncertainty reduction perspective using
Research (27:2), pp. 179-201. a mixed methods approach,” MIS Quarterly (42:3), pp. 779-803.
Raff, S.; Wentzel, D.; Obwegeser, N. (2020): “Smart Products: Stewart, K. A., and Segars, A. H. 2002. “An Empirical Examination
Conceptual Review, Synthesis, and Research of the Concern for Information Privacy Instrument,” Information
Directions,” Journal of Product Innovation Management, Systems Research (13:1), pp. 36-49.
forthcoming.

MIS Quarterly Vol. 45 No. 4 / December 2021 1887


Cichy et al. / Privacy Concerns and Data Sharing in the Internet of Things

Stone, E. F., Gueutal, H. G., Gardner, D. G., and McClure, S. 1983. Xu, H., Teo, H.-H., Tan, B. C. Y., and Agarwal, R. 2012. “Research
“A Field Experiment Comparing Information-Privacy Values, Note —Effects of Individual Self-Protection, Industry Self-
Beliefs, and Attitudes Across Several Types of Organizations,” Regulation, and Government Regulation on Privacy Concerns:
Journal of Applied Psychology (68:3), pp. 459-468. A Study of Location-Based Services,” Information Systems
Stone, E. F., and Stone, D. L. 1990. “Privacy in Organizations: Research (23:4), pp. 1342-1363.
Theoretical Issues, Research Findings, and Protection Yun, H, Lee, G, Kim, D.J. 2019. “A Chronological Review of
Mechanisms,” Research in Personnel and Human Resources Empirical Research on Personal Information Privacy Concerns:
Management (8), pp. 349-411. An Analysis of Situational Contexts and Research Constructs,”
Tax, S. S., Brown, S. W., and Chandrashekaran, M. 1998. Information & Management (56:4), pp. 570-601
“Customer Evaluations of Service Complaint Experiences: Zhao, X., Lynch, J., and Chen, Q. 2010. “Reconsidering Baron and
Implications for Relationship Marketing,” Journal of Kenny: Myths and Truths about Mediation Analysis,” Journal
Marketing (62), pp. 60-76. of Consumer Research (37), pp.197-206.
Vaia, G., Carmel, E., DeLone, W., Trautsch, H., and Menichetti, F. Zmud, J., Tooley, M., and Miller, M. 2016. Data Ownership Issues
2012. “Vehicle Telematics at an Italian Insurer: New Auto in a Connected Car Environment: Implications for State and
Insurance Products and a New Industry Ecosystem,” MIS Local Agencies, Texas A&M Transportation Institute.
Quarterly Executive (11:3), pp. 113-125.
Venkatesh, V., Brown, S., and Bala, H. 2013. “Bridging the
Qualitative-Quantitative Divide: Guidelines for Conducting
Mixed Methods Research in Information Systems,” MIS
About the Authors
Quarterly (37:1), pp. 21-54.
Venkatesh, V., Brown, S. A., and Sullivan, Y. W. 2016. Patrick Cichy is a professor at the Institute for Applied Data
"Guidelines for Conducting Mixed-methods Research: An Science & Finance at Bern University of Applied Sciences,
Extension and Illustration," Journal of the Association for Switzerland. He received his Ph.D. in technology management
Information Systems (17:7), pp. 435-495. from RWTH Aachen University, Germany. During his studies, he
von Solms, R., and van Niekerk, J. 2013. „From information was a visiting scholar at the School of Computing, National
security to cybersecurity,” Computers & Security, (38:0), pp. University of Singapore. In the past years, Patrick has published
97-102. articles in international scientific journals and contributions in
Ward, S., Bridges, K., and Chitty, B. 2005. “Do Incentives Matter? proceedings of leading international conferences. His work has
An Examination of On‐Line Privacy Concerns and Willingness been accepted for publication in outlets such as MIS Quarterly,
to Provide Personal and Financial Information,” Journal of Electronic Markets, R&D Management, Health Care Management
Marketing Communications (11:1), pp. 21-40. Review, and Proceedings of the International Conference on
Webster, J., Brown, G., Zweig, D., Connelly, C., Brodt, S., and Information Systems.
Sitkin, S. 2008. “Beyond Knowledge Sharing: Knowledge
Hiding and Hoarding at Work,” Research in Personnel and
Human Resources Management (27), pp. 1-37. Torsten Oliver Salge is a professor and co-director of the Institute
Westin, A. 1967. Privacy and Freedom., New York: Atheneum. for Technology and Innovation Management (TIM) within the TIME
White, R. W. 1959. “Motivation Reconsidered: The Concept of Research Area at RWTH Aachen University, Germany. Oliver is also
Competence,” Psychological Review (66), pp. 297-331. a fellow of Cambridge Digital Innovation, University of Cambridge.
Wired.com. 2015. “Hackers Remotely Kill a Jeep on the He received his PhD from the University of Cambridge and has held
Highway—With Me in It,” (visiting) appointments at universities in Auckland, Buenos Aires,
(https://www.wired.com/2015/07/hackers-remotely-kill-jeep- Bochum, Cambridge, Duisburg, Oxford, and Philadelphia. Oliver’s
highway/). work has been published in Academy of Management Review,
Wunderlich, P., Veit, D., and Sarker, S. 2019. “Adoption of Information Systems Research, Journal of Applied Psychology,
Sustainable Technologies: A Mixed-Methods Study of German Journal of Management, Journal of Product Innovation
Households,” MIS Quarterly (43:2), pp. 673-691. Management, Journal of Service Research, MIS Quarterly, and
Xiao, B., and Benbasat, I. 2007. “E-Commerce Product Research Policy, among other outlets. He currently serves as a
Recommendation Agents: Use, Characteristics, and Impact,” associate editor for Information Systems Research.
MIS Quarterly (31:1), pp. 137-209.
Xu, H., Dinev, T., Smith, H. J., and Hart, P. 2008. “Examining the
Formation of Individual’s Privacy Concerns: Toward an Rajiv Kohli is the John N. Dalton Memorial Professor of Business
Integrative View,” in Proceedings of the 29th International at the Raymond A. Mason School of Business, William & Mary.
Conference on Information Systems, Paris, France. Prior to joining academia, he was a project leader in Decision
Xu, H., Dinev, T., Smith, H. J., and Hart, P. 2011.”Information Support Services at Trinity Health. His research has been published
Privacy Concerns: Linking Individual Perceptions with in MIS Quarterly, Management Science, Information Systems
Institutional Privacy Assurances,” Journal of the Association Research, Journal of Management Information Systems, MIS
for Information Systems (12:12), pp. 798-824. Quarterly Executive, and Journal of Operations Management,
Xu, H., Teo, H.-H., Tan, B. C. Y., and Agarwal, R. 2009. “The Role among other journals. He is a co-author of the book The IT Payoff:
of Push-Pull Technology in Privacy Calculus: The Case of Measuring Business Value of Information Technology Investment,
Location-Based Services,” Journal of Management published by Financial Times Prentice-Hall. He currently serves as
Information Systems (26:3), pp. 135-174. a senior editor for Information Systems Research.

Appendix
Table A1. Goodness-of-Fit Assessments for Covariance-Based SEM (Study 2)
| Fit index | Description | Measurement model | Structural model |
| --- | --- | --- | --- |
| χ²/df | Fit-function-based index | 1.528 | 1.639 |
| RMSEA | Error of approximation index | 0.040 | 0.044 |
| GFI | Goodness-of-fit index | 0.938 | 0.913 |
| CFI | Incremental index with no correction for model complexity | 0.988 | 0.974 |
| TLI | Incremental index with adjustment for model complexity | 0.985 | 0.967 |
| SRMR | Standardized root mean square residual | 0.0368 | 0.0401 |
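Reported fit indices of this kind are usually screened against conventional heuristics (e.g., χ²/df ≤ 3, RMSEA ≤ .06, GFI ≥ .90, CFI/TLI ≥ .95, SRMR ≤ .08). As an illustrative sketch, where the cutoffs are common rules of thumb rather than values taken from the article and the helper `assess_fit` is hypothetical, the structural model values from Table A1 can be checked programmatically:

```python
# Commonly cited cutoffs; "max" means the index should not exceed the
# cutoff, "min" means it should reach at least the cutoff. Illustrative only.
CUTOFFS = {
    "chi2_df": (3.0, "max"),
    "rmsea": (0.06, "max"),
    "gfi": (0.90, "min"),
    "cfi": (0.95, "min"),
    "tli": (0.95, "min"),
    "srmr": (0.08, "max"),
}

def assess_fit(indices: dict) -> dict:
    """Return {index: True/False} indicating which cutoffs are met."""
    checks = {}
    for name, value in indices.items():
        cutoff, direction = CUTOFFS[name]
        checks[name] = value <= cutoff if direction == "max" else value >= cutoff
    return checks

# Structural model values from Table A1
structural = {"chi2_df": 1.639, "rmsea": 0.044, "gfi": 0.913,
              "cfi": 0.974, "tli": 0.967, "srmr": 0.0401}
print(assess_fit(structural))  # every index meets its heuristic cutoff
```

With the Table A1 values, both the measurement and structural models clear all of these illustrative thresholds.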

Table A2. Reliability, Convergent Validity, and Discriminant Validity Test Results of the Constructs (Study 2)
| Construct | Mean | SD | Cronbach’s alpha | Composite reliability | AVE | 1 | 2 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1. Privacy concern | 4.12 | 1.27 | 0.95 | 0.89 | 0.66 | 0.81 | |
| 2. Psychological ownership | 4.33 | 1.88 | 0.92 | 0.92 | 0.73 | 0.68 | 0.85 |
Note: Columns 1-2 show interconstruct correlations; diagonal cells represent the square root of the AVE of the respective construct.
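The discriminant validity logic behind this note is the Fornell-Larcker criterion: the square root of each construct’s AVE (the diagonal) should exceed that construct’s correlations with all other constructs. A minimal sketch using the Table A2 values (the helper name `fornell_larcker_ok` is hypothetical):

```python
import math

# Study 2 values from Table A2
ave = {"privacy_concern": 0.66, "psychological_ownership": 0.73}
correlations = {("privacy_concern", "psychological_ownership"): 0.68}

def fornell_larcker_ok(ave: dict, correlations: dict) -> bool:
    """True if every construct pair satisfies the Fornell-Larcker criterion,
    i.e. |correlation| is below the square root of both constructs' AVE."""
    return all(
        abs(r) < math.sqrt(ave[a]) and abs(r) < math.sqrt(ave[b])
        for (a, b), r in correlations.items()
    )

print(fornell_larcker_ok(ave, correlations))  # → True
```

Here √0.66 ≈ 0.81 and √0.73 ≈ 0.85 both exceed the interconstruct correlation of 0.68, so discriminant validity is supported for Study 2.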

Table A3. Reliability, Convergent Validity, and Discriminant Validity Test Results of the Constructs (Study 3)
| Construct | Mean | SD | Cronbach’s alpha | Composite reliability | AVE | 1 | 2 | 3 | 4 | 5 | 6 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1. Privacy concern | 4.37 | 1.35 | 0.95 | 0.87 | 0.64 | 0.80 | | | | | |
| 2. Psychological ownership | 4.73 | 1.66 | 0.92 | 0.91 | 0.73 | 0.59 | 0.85 | | | | |
| 3. Data security | 4.74 | 1.49 | 0.89 | 0.88 | 0.64 | -0.70 | -0.45 | 0.80 | | | |
| 4. Trust | 4.75 | 1.22 | 0.83 | 0.83 | 0.71 | -0.46 | -0.22 | -0.39 | 0.84 | | |
| 5. Self-efficacy enhancement | 3.65 | 1.54 | 0.92 | 0.92 | 0.75 | -0.01 | 0.04 | -0.13 | 0.16 | 0.87 | |
| 6. Self-image congruency | 2.85 | 1.46 | 0.76 | 0.77 | 0.63 | -0.04 | 0.04 | -0.21 | 0.17 | 0.82 | 0.80 |
Note: Columns 1-6 show interconstruct correlations; diagonal cells represent the square root of the AVE of the respective construct.



Table A4. Construct Measurements

Privacy concerns (Smith et al. 1996)

CFIP: Collection
- It usually bothers me when the company asks me for personal driving data.
- When the company asks me for personal driving data, I sometimes think twice before providing it.
- It bothers me to give personal driving data to the company.
- I am concerned that the company is collecting too much information about me.

CFIP: Unauthorized access
- I am concerned that the company may not devote enough time and effort to preventing unauthorized access to personal driving data.
- I am concerned that computer databases that contain personal driving data may not be well protected from unauthorized access.
- I am concerned that the company may not take measures to prevent unauthorized access to personal driving data.

CFIP: Error
- I am concerned that all the personal driving data in computer databases may not be double-checked for accuracy.
- I am concerned that the company may not take steps to make sure that my personal driving data in their databases is accurate.
- I am concerned that the company may not have well-established procedures to correct errors in personal driving data.
- I am concerned that the company may not devote time and effort to verifying the accuracy of personal driving data in their database.

CFIP: Secondary use
- I am concerned that the company may use the personal driving data for other purposes without notifying me or getting my authorization.
- I am concerned that the company may sell personal driving data in its databases to other companies.
- When I give personal driving data to the company for the use of its services, I am concerned that the company may use the information for other purposes.
- I am concerned that the company may share personal driving data with other parties without getting my authorization.

Psychological ownership (Pierce et al. 2001)
- This is my driving data.
- I feel a high degree of personal ownership for this driving data.
- I sense that this is my driving data.
- I feel that this driving data belongs to me.

Data security (author-developed)
- I am critical about the data security embedded in this technology. (reverse coded)
- I believe that hackers can easily access driving data from this technology. (reverse coded)
- I doubt that driving data is secure from unauthorized access in this technology. (reverse coded)
- I believe that this technology poses a real risk to the protection of driving data. (reverse coded)

Relational trust (Tax et al. 1998)
- I believe that this organization is trustworthy.
- I would find it necessary to be cautious in dealing with this organization. (reverse coded) †
- I believe the organization could not be relied upon to keep its promises. (reverse coded) †
- Overall, I believe this firm is honest.

Self-efficacy enhancement (Childers et al. 2001)
- The driving style assistant would help me to become a better driver.
- The driving style assistant would truly support me in my driving.
- The driving style assistant would make driving easier for me.
- The driving style assistant would assist my driving in a meaningful way.

Self-congruence (Sirgy et al. 1997)
- The driving style assistant suits me well. †
- The driving style assistant would allow me to express my personality.
- I imagine the typical user of the driving style assistant being very much like me.

Note: † Removed from further analysis in Study 3 due to low loadings / high cross-loadings.
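Items marked as reverse coded are typically re-scored before the items are averaged into a scale score. Assuming a 7-point Likert response format (the response format itself is not shown in this excerpt), the transform is score' = 8 - score. A minimal sketch with a hypothetical helper:

```python
def reverse_code(score: int, scale_max: int = 7) -> int:
    """Map a Likert response to its reverse-coded value,
    e.g. 1 -> 7 and 7 -> 1 on a 7-point scale."""
    return scale_max + 1 - score

# Example: a raw response of 2 ("disagree") on a reverse-coded data
# security item becomes 6 before averaging into the scale score.
print([reverse_code(s) for s in (1, 2, 7)])  # → [7, 6, 1]
```

The `scale_max` parameter generalizes the same transform to 5-point or other response formats.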

Please imagine the following scenario and state how you would react in the described situation:

Modern cars are nowadays equipped with various sensors that generate all kinds of digital data while the car is in use.
Examples of such driving-related data comprise information on outdoor temperature, braking- and acceleration figures, or
travel speed.

Your car manufacturer would like to use such driving-related data for an internal research project and asks you if you are willing to share such data. If you give permission to use your driving-related data, your car will automatically and in real-time transmit the data collected to your car manufacturer. There is nothing else that you need to take care of.

How it works
Your car manufacturer promises not to disclose the driving-related data to third parties, such as other companies, your
employer or the police. Instead, your driving-related data will be solely used for internal research purposes and will be
deleted afterwards. Please assume that no technical modifications to your car are necessary to participate in this research
project. You can terminate your participation in this project at any time, without any consequences for yourself.

In concrete terms, your car manufacturer requests the following set of driving-related data (exclusively, no other data are
requested), which your car will transmit in real-time (that means continually):

> Braking and acceleration figures (Data on the frequency and intensity of your braking and acceleration maneuvers -->
these data provide insights into your driving style).

> Travel speed (how fast you drive, in km/h)

Figure A1. Sample Scenario Description Text (Study 2)


