THE INFORMATION SOCIETY
2018, VOL. 34, NO. 1, 49–57
https://doi.org/10.1080/01972243.2017.1391912

Wearable devices and healthcare: Data sharing and privacy


Syagnik (Sy) Banerjee, Thomas Hemphill, and Phil Longstreet
School of Management, University of Michigan, Flint, Michigan, USA

ABSTRACT
Wearable devices introduce many new capabilities to the delivery of healthcare. But wearables also pose grave privacy risks. Furthermore, information overload gets in the way of informed consent by the patient. To better protect American patients in an increasingly digital world, the U.S. Congress passed the Health Insurance Portability and Accountability Act of 1996 (HIPAA). This article examines the adequacy of HIPAA vis-à-vis issues raised by wearable technologies in the Internet of Things environment and identifies policy gaps and factors that drive health data exposure. It presents a 2 × 2 Partnership-Identity Exposure Matrix, illustrates implications in four different contexts, and provides recommendations for improving privacy protection.

ARTICLE HISTORY
Received 28 October 2016
Accepted 14 September 2017

KEYWORDS
HIPAA; informed consent; Internet of Things; privacy; regulatory policy; wearables

Introduction

Wearable technologies are anticipated to have a major impact in the health sector. With wearables like Vital Connect's HealthPatch MD, physicians can remotely receive updates on patients' vitals (Terry 2014). For patients with amyotrophic lateral sclerosis (ALS) or other neurodegenerative diseases, wearables under development could scan their brainwaves, thoughts, feelings, and expressions and generate alerts and commands to electronic devices in the room (e.g., televisions, lighting) (Blum and Dare 2015). Wearable devices are also changing healthcare for obesity, cardiovascular diseases, asthma, and Alzheimer's, as well as in-hospital monitoring. They enable better patient monitoring, drug management, asset monitoring, tracking, and early medical interventions. In general, physicians, insurers, patients, and caregivers are anticipated to have unparalleled access to information (FTC Staff 2015).

But wearables also pose grave privacy risks, with potentially very serious consequences, including reduction in life chances. On the one hand, companies can gather and trade data gleaned from smartphone sensors and wearables to learn of moods, stress levels, habits, well-being, sleep patterns, exercise, movement, and so on, and use them to make credit, insurance, and employment decisions, compromising privacy and possibly even resulting in barriers to healthcare. On the other hand, information overload gets in the way of informed consent (Cate and Mayer-Schönberger 2013). The value of personal information is often not known when it is collected (i.e., when notice and consent are normally given); also, the relationship between users and processors of personal data has become increasingly complicated, as datasets are combined, transferred, shared, or sold. Consent notices that do not disclose the identity of third parties who can access user data forestall consumers' ability to provide genuinely "informed" consent. Additionally, consent notices are often written so broadly, or in such voluminous detail, that they inhibit the user's comprehension, and thus render "conscious choice" meaningless. Such notices, which create the illusion of consent, often distract users from their own privacy protection (Cate 2010). In effect, consumers do not really share but "surrender" information (Walker 2016).

Data collected by healthcare wearables are subject to various legal protections. There are sector-specific laws, such as the Americans with Disabilities Act, the Children's Online Privacy Protection Act, and the Fair Credit Reporting Act, not to mention federal and state laws on access to healthcare insurance and consumer discrimination. On the regulatory front, the U.S. Food and Drug Administration (FDA) regulates medical devices but not wearable devices, although there have been calls to include wearable devices in the FDA's regulatory purview (Future of Privacy Forum 2016). To better protect patients in an increasingly digital world, the U.S. Congress passed the Health Insurance Portability and Accountability Act of 1996 (HIPAA). This article examines the adequacy of HIPAA vis-à-vis issues raised by wearable technologies in the Internet of Things (IoT) environment (see note 1).


The rest of the article proceeds as follows: first, a healthcare IoT technologies taxonomy is presented; second, the interests of different stakeholders in the IoT ecosystem are identified and explained; third, the factors driving health data exposure are analyzed; fourth, implications in different types of scenarios are drawn; and last, recommendations are offered to improve privacy protection.

Wearables in healthcare

Applying the framework proposed by Healey, Pollard, and Woods (2015), healthcare devices are categorized as Wearable Health Monitoring Devices, Medical Wearable Devices, Medical Embedded Devices, and Stationary Medical Devices (see Figure 1 below). Wearable health monitoring devices are consumer products (e.g., Fitbit, Fuelband). Medical wearable devices are prescribed devices (e.g., insulin pumps). Medically embedded devices are implanted into the body (e.g., pacemakers) (Taylor 2016). Stationary medical devices are designed for use in a specific physical location (e.g., chemotherapy dispensing stations for home-based healthcare). Each of the devices in the aforementioned categories has some of the following attributes: identification (a serial number to provide positive identification of the user), location (geospatial coordinates), sensing (sensors that produce alert warning signals), and connectivity (the ability to connect with other devices and share information) (Dohr et al. 2010).

Figure 1. Healthcare Devices Typology.
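
These four attributes can be made concrete with a minimal sketch. The Python below models them as a simple record type; the class and field names are illustrative assumptions, not drawn from the cited taxonomy.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class WearableDevice:
        # Identification: serial number providing positive identification of the user.
        serial_number: str
        # Location: geospatial coordinates (latitude, longitude), if available.
        coordinates: Optional[Tuple[float, float]] = None
        # Sensing: alert/warning signals produced by on-board sensors.
        alerts: List[str] = field(default_factory=list)
        # Connectivity: other devices this one connects to and shares data with.
        paired_devices: List[str] = field(default_factory=list)

    # Example: a hypothetical monitoring patch reporting a heart-rate alert.
    patch = WearableDevice("VC-001234", (43.01, -83.69),
                           ["heart_rate_high"], ["companion-app"])

Note how even this tiny record couples an identifier, a location, and health signals: the combination, not any single field, is what creates privacy exposure.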
The wearable healthcare device market is forecasted to save 1.3 million lives annually (Dolan 2014b) and grow to $72.7 billion by 2021 (Malik 2016). As the demand for these IoT wearable devices has increased in both consumer and medical markets, the number of entities interested in them has also increased. They are broadly categorized as Healthcare, Bystander Beneficiaries, Hardware and Infrastructure, and Watchdogs. Healthcare IoT entities are those that are directly involved in the use of devices in medical treatments. They include healthcare providers (e.g., doctors, hospitals, and laboratories) and also patients who consent to the use of these devices. Bystander Beneficiary IoT entities are those that seek to capture market value by utilizing advances in healthcare and well-being related IoT technologies without investing in the delivery process (e.g., employers, insurance companies, data brokers, and marketers). In other words, they are users of the information provided by the IoT devices and are involved in neither the manufacturing, selling, or supporting of the IoT devices, nor the information creation or delivery process. Infrastructure IoT entities are those that make, sell, support, facilitate, or connect the devices (e.g., device manufacturers, infrastructure providers, tech support providers, and app developers). Finally, Watchdog IoT entities are primarily interested in ensuring that technology specifications and ethical practices are met and employed in the use of IoT technologies (e.g., the Food and Drug Administration, the Federal Trade Commission, and the Future of Privacy Forum).

Several organizations have initiated corporate wellness programs, which include the use of wearable devices. Oscar Health Insurance has offered its customers a $1 credit towards an Amazon gift card every time they reach a daily step count goal (Bertoni 2014). Cigna has distributed armbands to thousands of its employees. Self-insured employers, like Autodesk, have purchased Fitbit trackers and distributed them to their employees (Olson 2014). Oil giant BP has provided incentives to employees for wearing Fitbits and permitting the company to track their daily work steps (Olson 2014). Some employers are using Wildflower, an application for pregnant women to measure weight gain at pregnancy milestones, to reduce maternity-related medical costs (Olson 2014). Conversely, some companies are using punitive measures to change behavior. For example, stickK believes that taking away wellness points, a disincentive, is more effective than offering rewards (Olson 2014). Some self-insured companies are planning to add a $50 surcharge to the insurance premiums of employees who smoke. CVS has been charging employees $50 per month if they refuse to report weight, body fat, cholesterol, blood pressure, and sugar levels. Overall, improved wellness compliance is emerging, but it raises vexing ethical questions about the use of the data being captured.

Such efforts rest on the assumption that increasing health costs are a product of individual lifestyle choices. In contrast, there exists another perspective that takes a systemic view. In this perspective, supposed personal "choice" is viewed by scholars as a product of the environment (Alkon and Agyeman 2011; Langston 2010) and of ecologies that are "inescapable" (e.g., prenatal exposure to toxins can lead to outcomes like obesity) (Nash 2006). Guthman (2011) argues that obesity is a symptom of a greater problem, which can only be solved by challenging the cultural values or economic interests of powerful entities that are invested in existing socioeconomic systems. Biltekoff (2007) goes a step further and urges individuals to make responsible choices for protecting their own health instead of relying on government protections and services, which are a product of neoliberal ideologies. However, in practice, the individual lifestyle choices narrative informs most analyses and recommendations (Bipartisan Policy Center 2012; The Physicians Foundation 2012).

Health Insurance Portability and Accountability Act

The Health Insurance Portability and Accountability Act of 1996, or HIPAA (U.S. Government Publishing Office 2013), Public Law 104-191, was enacted on August 21, 1996. Sections 261 through 264 of HIPAA require the U.S. Secretary of Health and Human Services (HHS) to publicize standards (also known as the Administrative Simplification provisions) for the electronic exchange, privacy, and security of patient health information. The HHS Standards for Privacy of Individually Identifiable Health Information administrative rule ("Privacy Rule") was published on August 14, 2002, in the Federal Register and codified at 45 CFR Parts 160 and 164.

The primary goal of the Privacy Rule is to protect patients' health information while allowing the flow of such information to "covered entities", i.e., healthcare providers, health plans, healthcare clearinghouses, and relevant associates (U.S. Department of Health and Human Services 2003). In addition, HHS also published the Security Standards for the Protection of Electronic Protected Health Information ("Security Rule"), which establish a national set of security standards for protecting certain health information that is held or transferred in electronic form (Scholl et al. 2008). A major goal of the Security Rule is to protect the privacy of patients' health information while allowing covered entities to adopt new technologies to improve the quality and efficiency of patient care.

Obtaining written permission from individuals to use and disclose their protected health information for treatment, payment, and healthcare operations is optional under the Privacy Rule for all covered entities (U.S. Department of Health and Human Services 2003). However, HIPAA, and specifically the Privacy Rule, requires that a covered entity obtain the patient's written authorization for any use or disclosure of protected health information that is not for treatment, payment, or healthcare operations, or otherwise permitted or required by the Privacy Rule (U.S. Department of Health and Human Services 2003).

On the surface, it seems that for patients receiving treatment from a healthcare provider that uses wearable devices to monitor their health, any data shared with business associates are well protected under the Privacy Rule (45 CFR 164.504(e)). Business associate functions and activities include: claims processing or administration; data analysis, processing, or administration; utilization review; quality assurance; billing; benefit management; practice management; and repricing. Business associate services include legal, actuarial, accounting, consulting, data aggregation, management, administrative, accreditation, and financial services.

Similarly, under Title XIII of the American Recovery and Reinvestment Act of 2009, the Health Information Technology for Economic and Clinical Health (HITECH) Act has security requirements. For example, it requires business associates to comply directly with Security Rule provisions directing implementation of administrative, physical, and technical standards for electronic protected health information; to develop and enforce policies and documentation standards; and to report noncompliance and terminate contracts with noncompliant entities; it also requires HHS to conduct compliance audits.

However, different problems emerge as far as wearable devices and the adequacy of HIPAA are concerned. Much of the data leakage occurs due to policy gaps and explicit policy violations, as well as legal but self-interest-driven marketplace behavior by covered entities.

Factors increasing health data exposure

In this section, we identify and discuss the factors that drive the exposure of personal health data. This understanding is necessary for developing effective remedies.

1. HIPAA

i) Conditionality of protection: According to the Privacy Rule (45 CFR 160.103), a business associate is defined as a person or organization that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity. In other words, a wearables manufacturer or distributor that neither provides services to any covered entity, nor uses or discloses information on behalf of a covered entity, cannot be subject to the compliance requirements of a business associate as defined under HIPAA. Despite additional regulatory safeguards initiated by the healthcare sector, such as the Omnibus Rule, increased penalties, breach notification standards, and the incorporation of additional protected health information (PHI) elements (U.S. Government Publishing Office 2013), these safeguards continue to apply only to business associates, and mHealth technologies remain "noncovered entities" (NCEs). Even if a doctor endorses a particular app, or even if healthcare providers and app developers enter into interoperability arrangements, HIPAA does not protect consumer data. The status of a wearable manufacturer-distributor depends on whether its participation has been contracted by the healthcare provider for patient management services. If the contract indicates that the wearable device manufacturer engages in transmission of data on behalf of the healthcare provider, the manufacturer is considered a covered entity. However, if a patient purchases a wearable device from a retailer, uses it, and voluntarily shares some results with his or her doctor, who then responds with some advisory comments, the company manufacturing the device is not a HIPAA covered entity; consequently, the data collected by the manufacturer is neither bound by nor protected under HIPAA. This situation is similar to sharing data after an automobile accident, when both the healthcare insurer and the automobile insurer receive medical bills. One can be bound by HIPAA and the other may not be, because the health insurer has not partnered with and endorsed the automobile insurer under its own line of business. When the HIPAA guidelines were developed, the increasing diffusion of mobile wearables and the subsequent proliferation of third-party entities were not foreseen as a potential complication. The only way to make wearable device manufacturers or distributors conform to HIPAA laws is to have them partner with covered entities.

ii) Protected data types: The applicability of HIPAA depends on what fields of information are being collected and shared. The HIPAA Privacy Rule protects most "individually identifiable health information", such as medical records, laboratory reports, and hospital bills, because they can identify the patient by name. The University of California, San Francisco, uses eighteen criteria defining healthcare data covered under HIPAA (Lee 2015), which include names; geographical markers smaller than a state (street address, county, and zip code); elements of dates directly related to a person (birth, admission, and discharge); phone, fax, and e-mail; social security numbers; numbers related to medical records, health plan beneficiaries, accounts, certificates, and licenses; vehicle identifiers and license plates; IP addresses; device identifiers; web URLs; biometric identifiers; and photographic images. However, many fields of information that appear to be private, such as blood pressure or sleep statistics, can be shared separately as long as they are not identified by unique personal information.

iii) Definition of unique identifiers: Collected health and behavior data or variables can become unique identifiers themselves, though they are not recognized or categorized as PHI under HIPAA. At a data conference in New York City, Ira "Gus" Hunt, the CIA's Chief Technology Officer, said that a person can be identified with one hundred percent certainty simply by their gait, i.e., how they walk (Kelley 2013). In other words, gradually evolving fields of behavioral data that we merely consider patterns could be as unique as fingerprints, and therefore personally identifiable.

iv) Ability to bypass identification: Even if the concerned entity qualifies as a business associate that is required to comply with HIPAA, in recognition of the potential usefulness of health information when not identifiable, §164.502(d) of the Privacy Rule allows a covered entity or its business associate to create information that is not individually identifiable by following the de-identification standard and implementation specifications in §164.514(a)-(b). If that is done, identification can still be achieved post-sharing, by creating and tracking unique proxy combination patterns, in effect bypassing the Privacy Rule (see the sketch following this list).
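
Items ii) through iv) can be illustrated with a minimal sketch of Safe Harbor-style de-identification. The record layout and field names below are hypothetical, not taken from any real system; the point is that stripping the enumerated identifiers is mechanical, while the surviving "pattern" fields can still act as fingerprints.

    # Minimal sketch: drop fields resembling HIPAA's enumerated identifiers.
    # The identifier list here is abbreviated and illustrative only.
    DIRECT_IDENTIFIERS = {
        "name", "street_address", "zip_code", "birth_date", "phone", "email",
        "ssn", "medical_record_number", "device_id", "ip_address", "photo",
    }

    def deidentify(record: dict) -> dict:
        """Drop enumerated direct identifiers, keeping everything else."""
        return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    record = {"name": "A. Patient", "device_id": "W-42", "zip_code": "48502",
              "blood_pressure": "132/88", "sleep_hours": 5.2,
              "gait_signature": [0.91, 0.34, 0.77]}  # a behavioral pattern

    safe = deidentify(record)
    # The surviving fields (blood pressure, sleep, gait) are not PHI, yet a
    # stable gait signature can be as unique as a fingerprint, so a determined
    # third party could still re-identify the user after the data are shared.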

2. E-discovery

Personal health data can be made accessible for legal purposes. In 2014, a law firm in Calgary used activity data from a Fitbit to assess the effect of an accident on their client in a personal injury case (Olson 2014). Similarly, in 2015, a woman's fitness watch disproved her rape report (Snyder 2015). The entry of wearable device data into judicial proceedings through e-discovery has been facilitated by analytics platforms like Vivametrica, which use public health records to compare a specific person's records with those of the general population. Once used in judicial proceedings, this personal health information can become part of publicly searchable information under open records policies or the Freedom of Information Act. Though there can be exemptions to preserve security and privacy, the provisions vary from state to state (Solove 2002).

3. Data sharing modalities

i) Ambiguous legal status of shared data: The existing public policy environment is ambiguous about the future of shared data. If an entity legitimately shares data with a firm (neutral, third party, or otherwise) that undergoes a change in ownership or enters bankruptcy, how is the collected data to be treated? Should it be treated as a firm-owned asset? In an end user license agreement (EULA), the terms and conditions of wearable devices often indicate that in the event of the sale of the company, or in bankruptcy proceedings, the company reserves the right to sell the data (Maddox 2014). When RadioShack underwent bankruptcy, it requested the court to allow it to sell data to pay off its creditors. AT&T and Apple filed a motion to prevent the sale of data concerning customers who had purchased Apple or AT&T devices (Isidore 2015). Similar ambiguities exist regarding the status of federal government data, which become very apparent when there is a change in administration.

ii) Encryption and hardware connectivity: The existing encryption of wearable device and hardware identifiers by manufacturers is inadequate. Wearable devices are usually synced via Bluetooth Low Energy to wirelessly connect to mobile devices or computers. Security firm Symantec conducted an experiment in which it built portable Bluetooth scanners using Raspberry Pi minicomputers and readily available retail components, which could be purchased and assembled for $75 by any person with basic IT skills. Using the scanners, Symantec employees were able to scan the airwaves for signals broadcast from devices and extract device hardware addresses and location, as well as other personal data (Barcena, Wueest, and Lau 2014) (see the sketch following this list). While location in itself is not a data field covered by HIPAA, when connected to an identifier it supports inferences, such as air quality and pollution estimates, that can inform audiences about the potential health risks of the individual.

iii) Cloud storage vulnerability: Most wearable data transferred between wearables and mobile devices are stored in "clouds." Due to the lack of associated physical space or control, cloud storage can be vulnerable to breaches by hackers. In the first quarter of 2015, approximately one hundred million patient records were compromised in the top six attacks (ranked by volume of compromised patient records) (Munro 2015).

iv) Data sovereignty: Since most wearable device applications store the generated data in the cloud, there is a wide variety of national laws determining how protected the data are and how safely they are stored. Earlier, "safe harbor" regulatory compliance with the European Union's data protection framework required data processing entities to provide sufficient data protection guarantees "irrespective of their location" (Gibbs 2015). However, following the Snowden revelations regarding U.S. intelligence agencies' surveillance of commercial Internet services, the European Union struck out the "safe harbor" provision, and the consequent move towards adhering to domestic privacy policies could impede the movement of data across borders (Meyer 2016). Similarly, Indonesia introduced Regulation PP 82/2012, which requires data service providers to store the personal data of its citizens within the country. Australia has also instituted laws and regulations for such data carried overseas. Moreover, more explicit rules are expected to be issued under the EU General Data Protection Regulation (Bradbury 2015). The increasing push for data sovereignty and adherence to domestic laws, as opposed to international frameworks, could create data vulnerabilities (Berry and Reisman 2012). For instance, start-up application developers operating within tight budgets could sign up with cloud storage providers operating from countries with more "flexible", i.e., less rigorous, regulations.

v) Data brokers: There is a large, disorganized data broker industry that has evolved in the pursuit of profits. In 2012, nine data broker firms analyzed by the FTC generated $426 million in annual revenue (Federal Trade Commission 2014). Unique personal information is a high-priced commodity, and it is estimated that the information gathered by wearables and stored on a smartphone or in a "cloud" is worth ten times that of credit cards on the black market (Maddox 2015). A study by the FTC found that twelve health applications that were tested transmitted information to seventy-six different third parties (Dolan 2014a), some of which were data brokers, who aggregate and sell consumer data in the marketplace (Addonizio 2015). What is disturbing is that much of this data consisted of unique device identifiers, social security numbers, demographics, court and public records, travel, running routes, dietary habits, sleep patterns, addresses, financial information, full names, health information, location, date of birth, and other similar personally identifiable information (Privacy Rights Clearinghouse 2013).

vi) Corporate alliances: Organizations and insurers demand so-called "granular" detailed information, because more detailed behavioral profiles offer insurers more accurate impressions of individual health risks. There have been reports of regional hospitals, insurers, and grocery retailers collaborating to transform buyers' grocery purchase data into health risk profiles (Sherbit 2015).

vii) Liability minimization: Employers who implement healthy work practices monitor employees to ensure improved health and justify reduced insurance premiums. In order to mitigate liabilities related to the storage and potential misuse of data accrued while monitoring, employers often introduce neutral third-party firms, such as StayWell or Welltok, to help remove personal health information from employee records (Firestone 2014). There is a potential threat that such neutral third-party firms can "feed" or join the data broker industry, exacerbating misuse concerns and liability.
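
To illustrate item ii) above, the sketch below lists nearby Bluetooth Low Energy advertisements. It uses the open-source bleak library, an assumption on our part; Symantec's actual tooling is not public, but passively logging broadcast hardware addresses is essentially what the $75 Raspberry Pi scanners did.

    import asyncio
    from bleak import BleakScanner  # pip install bleak

    async def main():
        # Listen for BLE advertisements for 10 seconds; wearables syncing over
        # Bluetooth Low Energy broadcast these packets continuously.
        devices = await BleakScanner.discover(timeout=10.0)
        for d in devices:
            # The hardware address alone lets a passive listener log when a
            # given device, and hence its wearer, is nearby.
            print(d.address, d.name)

    asyncio.run(main())

No pairing or authentication is required to receive these advertisements, which is why weak or absent identifier encryption matters.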

4. End user license agreements

End user license agreements (EULAs), whose density alone has a significant negative effect on users' willingness to read and comprehend them (Böhme and Köpsell 2010), often overreach because they are designed without considering the legal enforceability of their terms and conditions. In the process, a wide variety of terms that are not legally enforceable are included in the EULA, further undermining users' ability to understand what they are consenting to when they start using the device.
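
As one rough, illustrative proxy for that density effect, the sketch below computes the Flesch Reading Ease score of a license text. The naive syllable counter is an assumption adequate only for ballpark figures, not a validated readability instrument.

    import re

    def count_syllables(word: str) -> int:
        # Naive heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text: str) -> float:
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        n = max(1, len(words))
        syllables = sum(count_syllables(w) for w in words)
        # Flesch Reading Ease: higher is easier. Dense EULAs often score far
        # below the roughly 60-70 range typical of plain consumer prose.
        return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

    eula = ("Licensee hereby irrevocably consents to the collection, "
            "aggregation, and onward transfer of biometric telemetry "
            "to affiliated entities.")
    print(round(flesch_reading_ease(eula), 1))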

Health data exposure contexts

In this section, we use Nissenbaum's (2004) Contextual Integrity Framework to detect ethical violations and the existence of policy gaps, and to provide recommendations. To assess whether storage or transfer of information violates contextual integrity, one must examine four factors (Nissenbaum and Patterson 2016): (1) the prevailing context in which data is generated; (2) the key actors, their roles and responsibilities; (3) key information attributes; and (4) the principles of data transmission between parties. These factors determine the norms of appropriateness (whether it is appropriate for concerned entities to access and store the information at hand) and distribution (when it is acceptable to share the same information with external parties). We employ two factors, varying roles and information attributes, from Nissenbaum and Patterson (2016) to design a simple 2 × 2 Partnership-Identity Exposure Matrix (see Table 1 below) to map different scenarios of health data exposure.

We start by defining the family of terms included under the umbrella of health information. Health information (HI) is a broad term that includes both protected health information (PHI), which contains unique identifiers that can be directly linked back to the patient, and de-identified health information (DHI), which is not readily identifiable. Here it is important to clarify that DHI is not necessarily anonymous data; in fact, data can vary from completely anonymous to completely identifiable, differing in the extent to which identification is preserved, masked, or eliminated, based on the effort, skill, time, and cost necessary for complete identification (Nelson 2015). Often, a minimal level of identifiability is desired in the data so that health researchers can combine datasets, or conduct effectiveness or policy analysis for greater societal benefits. To define this safety standard of identifiability, HIPAA's Privacy Rule provides specific guidelines for alternative methods of de-identification (HHS 2012), thus mitigating privacy risks for patients and enabling safe data sharing practices.
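
One common way to put a number on where a dataset sits on that identifiability spectrum is a k-anonymity check over its quasi-identifiers. The sketch below is a minimal illustration with invented column names; it is not HIPAA's Safe Harbor or Expert Determination method itself.

    from collections import Counter

    def k_anonymity(rows: list, quasi_identifiers: list) -> int:
        """Smallest group size over the quasi-identifier combination;
        k = 1 means at least one person is uniquely identifiable."""
        groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
        return min(groups.values())

    rows = [
        {"zip3": "485", "age_band": "40-49", "steps_avg": 6200},
        {"zip3": "485", "age_band": "40-49", "steps_avg": 9100},
        {"zip3": "490", "age_band": "30-39", "steps_avg": 7400},  # unique combo
    ]
    print(k_anonymity(rows, ["zip3", "age_band"]))  # -> 1: not safely de-identified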

Hence, data processed through the above methods of de-identification, i.e., DHI, is considered safer to share than PHI. Also, as described earlier, vendors and device manufacturers who have partnered with healthcare service providers are subject to HIPAA regulations as covered entities (CEs) or business associates (BAs). Those who are not partnered are not subject to HIPAA regulations, and are considered noncovered entities (NCEs).

The following scenarios depict contexts in which a wearable device manufacturer stores and shares HI with third-party data brokers. The scenarios vary on two factors.

Table 1. Partnership-Identity Exposure Matrix.

Partnership of device manufacturer     Identity exposure level of shared health information
with healthcare service provider       Shared DHI            Shared PHI
Partnered (CEs)                        A (CE, DHI)           B (CE, PHI)
Not partnered (NCEs)                   C (NCE, DHI)          D (NCE, PHI)

"Partnership" indicates whether or not the wearable device manufacturer has been contracted by a healthcare service provider for patient management services and, as a result, plays the role of a CE or an NCE; "identity exposure" indicates whether or not the shared data was processed through a de-identification method prescribed in the HIPAA privacy laws. The identity exposure level (DHI or PHI) indicates whether or not a user is vulnerable through identifiable information, and the partnership status (CE or NCE) indicates whether or not the user can avail legal recourse in the event of harm.
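
The matrix logic can be summarized in a few lines of code. This is a sketch whose risk labels paraphrase the discussion that follows, not a formal model from the cited framework.

    def exposure_scenario(partnered: bool, shares_phi: bool):
        """Map the two factors onto cells A-D of Table 1.
        partnered  -> device maker is a covered entity (legal recourse exists)
        shares_phi -> shared data still carries unique identifiers (user vulnerable)
        """
        cell = {(True, False): "A", (True, True): "B",
                (False, False): "C", (False, True): "D"}[(partnered, shares_phi)]
        if not shares_phi:
            risk = "low risk: user not vulnerable"
        elif partnered:
            risk = "violation, but with legal recourse (HIPAA applies)"
        else:
            risk = "most dangerous: vulnerable user, no legal recourse"
        return cell, risk

    print(exposure_scenario(partnered=False, shares_phi=True))  # -> ('D', ...)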

Given these conditions, both scenarios A and C are relatively low risk, since the user is not vulnerable in either case, whether they have legal recourse or not. Scenario B is a concerning scenario, where PHI has been shared with a third-party data broker. However, since the device manufacturer is a CE, the user has legal recourse for any violation. Scenario D is the most concerning, because user PHI has been shared with a third-party data broker, making the user potentially vulnerable to targeting or manipulation, and the device manufacturer is an NCE, hence the user has no legal recourse for any potential harm.

We identify the device manufacturers as the primary party and examine whether they should have access to stored user HI, and then analyze whether they should be allowed to share HI with third-party data brokers. If a device manufacturer is a CE, as a partner it functions as an entity instrumental to healthcare service delivery, which makes it appropriate for it to possess or store user HI. However, in the case of an NCE, the manufacturer is not required to be HIPAA compliant, because the NCE strictly sells or maintains medical devices, but does not create, transmit, or receive HI on behalf of any CE. Further, if the entity is an NCE, it does not necessarily share the missions and visions of healthcare providers, and consequently, it does not need to comply with the norms of the healthcare industry, to which it does not belong. Thus, the NCE device manufacturer can provide lifestyle-related services, which involve storage of user health/fitness related information, and its access to user HI does not violate the norms of appropriateness, because it is not subject to HIPAA.

While it may be appropriate for a device manufacturer to store HI, that does not imply the manufacturer has the right to further distribute stored HI. HIPAA allows covered entities to share HI only when it is not identifiable. In other words, contextual integrity is violated when PHI, which is identifiable, is shared. Such violation of contextual integrity can lead to informational harms (e.g., targeting of vulnerable populations based on sensitive information), informational inequality (e.g., data brokers wielding significant power over the fates of individual users), and disruption of relationships (e.g., between service provider and client), among others (Nissenbaum 2004). Overall, scenario B illustrates a clear violation of contextual integrity, since the device manufacturer is a covered entity (CE) that has shared identifiable PHI in contradiction to the requirements of HIPAA. But the most dangerous scenario is D, which can lead to user vulnerability without offering legal recourse, thus becoming an ethical, but not legal, violation.

When HIPAA was enacted, healthcare service providers' medical record documents were the primary, if not the only, sources of health information. This is the reason why nonhealthcare entities were not included in the purview of the law. However, the increased penetration of technologies capable of generating PHI, the lack of laws to protect user data from NCEs, and the increasing diversity of nonhealthcare providers with access to such information have together increased the risks of consumer data breaches and misuse. In that light, transfer of PHI from NCEs requires greater scrutiny.

Recommendations

Ideally, many of the problems with the sharing of health data would be addressed by the industry itself. Forms of industry self-regulation range from "pure" self-regulation, where there is no government or other stakeholder involvement in the private regulatory process, to public standard-setting or pricing and output setting, where there is a high level of direct government oversight (Garvin 1983). However, the greatest potential for industry self-regulation resides with a mixed form of industry rule-making and government oversight (Gupta and Lad 1983). We therefore make the following recommendations:

1. The U.S. Department of Health and Human Services should create a "watchdog" unit charged with identifying and monitoring types of new behavioral data that can be captured by wearable technology. Such a unit should examine the extent to which the data enables user identification. If that extent is high, the type of data should be added to HIPAA's list of unique identifiers. In general, this list should be updated at least annually.

2. The FTC should require NCE device manufacturers, based on Section 5 of the FTC Act ("prevent unfair or deceptive acts or practices in or affecting commerce"), to record and track any transactions involving the sharing, sale, or distribution of information that would be considered PHI under HIPAA, and appoint third-party agencies or experts to assess and certify parties buying data from NCE device manufacturers.

3. While the EEOC already provides guidelines for operating employee wellness programs and voluntary (not mandatory) transmission of data from employees to employers, it should also monitor the aggregate insurance premiums of participants over time to ensure that employees are not discriminated against.

4. Manufacturers should develop identification protocols for detecting sensitive information and give it enhanced protection in routing, usage, and storage using encryption and other techniques (a sketch of such field-level protection follows this list). They should also develop mechanisms for alerting the user.
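
A minimal sketch of the field-level protection that recommendation 4 envisions, using the pyca/cryptography library's Fernet recipe. The sensitive-field list and key handling are illustrative assumptions; real deployments need proper key management.

    from cryptography.fernet import Fernet  # pip install cryptography

    SENSITIVE_FIELDS = {"heart_rate", "sleep_hours", "location"}  # assumed list

    def protect(record: dict, f: Fernet) -> dict:
        # Encrypt only fields flagged as sensitive before routing or storage;
        # non-sensitive fields stay usable for ordinary app features.
        return {k: f.encrypt(str(v).encode()) if k in SENSITIVE_FIELDS else v
                for k, v in record.items()}

    key = Fernet.generate_key()  # in practice: kept in a key vault, not in code
    f = Fernet(key)
    protected = protect({"device": "W-42", "heart_rate": 88,
                         "location": "43.0,-83.7"}, f)
    # Upload `protected`; without the key, a cloud breach exposes only ciphertext.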

Note

1. The Internet of Things consists of billions of smart devices communicating with each other. It comprises "'things' such as devices or sensors—other than computers, smartphones, or tablets—that connect, communicate, or transmit information with or between each other through the Internet" (FTC Staff 2015). This networked system of conversing devices and products is growing exponentially, from an estimated two billion devices in 2006 to a forecasted fifty billion by 2020 (Evans 2011).

References

Addonizio, G. 2015. The privacy risks surrounding consumer health and fitness apps, associated wearable devices, and HIPAA's limitations. Student Scholarship Paper 861. South Orange, NJ, USA: Seton Hall Law. Retrieved from http://scholarship.shu.edu/student_scholarship/861

Alkon, A. H., and J. Agyeman. 2011. Cultivating food justice: Race, class, and sustainability. Cambridge, MA, USA: MIT Press.

Barcena, M. B., C. Wueest, and H. Lau. 2014. How safe is your quantified self? Mountain View, CA, USA: Symantec. Retrieved from https://www.symantec.com/content/dam/symantec/docs/white-papers/how-safe-is-your-quantified-self-en.pdf

Berry, R., and M. Reisman. 2012. Policy challenges of cross-border cloud computing. Journal of International Commerce and Economics, Web Version, 1–38. United States International Trade Commission. Retrieved from https://www.usitc.gov/journals/policy_challenges_of_cross-border_cloud_computing.pdf

Bertoni, S. 2014, December 8. Oscar Health using Misfit wearables to reward fit customers. Retrieved April 23, 2017, from http://www.forbes.com/sites/stevenbertoni/2014/12/08/oscar-health-using-misfit-wearables-to-reward-fit-customers/

Biltekoff, C. 2007. The terror within: Obesity in post 9/11 U.S. life. American Studies, 48(3), 29–48.

Bipartisan Policy Center. 2012. What is driving U.S. health care spending? America's unsustainable health care cost growth. Washington, DC, USA: Bipartisan Policy Center.

Blum, B., and F. Dare. 2015. Enhancing clinical practice with wearables: Innovations and implications. Arlington, VA, USA: Accenture.

Böhme, R., and S. Köpsell. 2010. Trained to accept? A field experiment on consent dialogs. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2403–6. New York, NY, USA: ACM.

Bradbury, D. 2015, December 17. After safe harbour: Navigating data sovereignty. Retrieved October 6, 2016, from http://www.theregister.co.uk/2015/12/17/navigating_data_sovereignty/

Cate, F. H. 2010. Protecting privacy in health research: The limits of individual choice. California Law Review, 96(2), 1765–803.

Cate, F. H., and V. Mayer-Schönberger. 2013. Notice and consent in a world of Big Data. International Data Privacy Law, 3(2), 67–73.

Dohr, A., R. Modre-Osprian, M. Drobics, D. Hayn, and G. Schreier. 2010. The Internet of Things for ambient assisted living. ITNG, 10, 804–9.

Dolan, B. 2014a, May 23. In-depth: Consumer health and data privacy issues beyond HIPAA. Retrieved October 24, 2016, from http://www.mobihealthnews.com/33393/in-depth-consumer-health-and-data-privacy-issues-beyond-hipaa

Dolan, B. 2014b, December 16. Prediction: Health wearables to save 1.3 million lives by 2020. Retrieved October 24, 2016, from http://www.mobihealthnews.com/39062/prediction-health-wearables-to-save-1-3-million-lives-by-2020

Evans, D. 2011, April. The Internet of Things: How the next evolution of the Internet is changing everything. Cisco IBSG. Retrieved from http://www.cisco.com/c/dam/en_us/about/ac79/docs/innov/IoT_IBSG_0411FINAL.pdf

Federal Trade Commission. 2014. Data brokers: A call for transparency and accountability. Washington, DC, USA: Federal Trade Commission. Retrieved from https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf

Firestone, C. M. 2014. Developing policies for the Internet of Things, background materials. Twenty-Ninth Annual Aspen Institute Conference on Communications Policy.

FTC Staff. 2015. Internet of Things: Privacy and security in a connected world. Technical report. Washington, DC, USA: Federal Trade Commission.

Future of Privacy Forum. 2016. FPF mobile apps study. Washington, DC, USA: Future of Privacy Forum. Retrieved from https://fpf.org/wp-content/uploads/2016/08/2016-FPF-Mobile-Apps-Study_final.pdf

Garvin, D. A. 1983. Can industry self-regulation work? California Management Review, 25(4), 37–52.

Gibbs, S. 2015, October 6. What is "safe harbour" and why did the EUCJ just declare it invalid? The Guardian. Retrieved from https://www.theguardian.com/technology/2015/oct/06/safe-harbour-european-court-declare-invalid-data-protection

Gupta, A. K., and L. J. Lad. 1983. Industry self-regulation: An economic, organizational, and political analysis. Academy of Management Review, 8(3), 416–25.

Guthman, J. 2011. Weighing in: Obesity, food justice, and the limits of capitalism (Vol. 32). Oakland, CA, USA: University of California Press.

Healey, J., N. Pollard, and B. Woods. 2015. The healthcare Internet of Things: Rewards and risks. Atlantic Council. Retrieved from http://www.atlanticcouncil.org/publications/reports/the-healthcare-internet-of-things-rewards-and-risks

HHS. 2012. Guidance regarding methods for de-identification of protected health information in accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. Retrieved April 28, 2017, from https://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html

Isidore, C. 2015, May 15. Apple, AT&T fight sale of RadioShack customer data. Retrieved October 23, 2016, from http://money.cnn.com/2015/05/15/news/companies/radioshack-apple-att/index.html

Kelley, M. B. 2013, March 21. CIA chief tech officer: Big data is the future and we own it. Retrieved October 23, 2016, from http://www.businessinsider.com/cia-presentation-on-big-data-2013-3

Langston, N. 2010. Toxic bodies: Hormone disruptors and the legacy of DES. New Haven, CT, USA: Yale University Press.

Lee, K. 2015, July. Wearable health technology and HIPAA: What is and isn't covered. Retrieved October 23, 2016, from http://searchhealthit.techtarget.com/feature/Wearable-health-technology-and-HIPAA-What-is-and-isnt-covered

Maddox, T. 2014, July 3. The scary truth about data security with wearables. Retrieved October 23, 2016, from http://www.techrepublic.com/article/the-scary-truth-about-data-security-with-wearables/

Maddox, T. 2015, October 7. The dark side of wearables: How they're secretly jeopardizing your security and privacy. Retrieved October 23, 2016, from http://www.techrepublic.com/article/the-dark-side-of-wearables-how-theyre-secretly-jeopardizing-your-security-and-privacy/

Malik, M. A. 2016. Internet of Things (IoT) healthcare market by component (implantable sensor devices, wearable sensor devices, system and software), application (patient monitoring, clinical operation and workflow optimization, clinical imaging, fitness and wellness measurement)—Global opportunity analysis and industry forecast, 2014–2021. Allied Market Research. Retrieved from https://www.alliedmarketresearch.com/iot-healthcare-market

Meyer, D. 2016, February 25. Here comes the post-safe harbor EU privacy crackdown. Retrieved October 24, 2016, from http://fortune.com/2016/02/25/safe-harbor-crackdown/

Munro, D. 2015, December 31. Data breaches in healthcare totaled over 112 million records in 2015. Retrieved October 23, 2016, from http://www.forbes.com/sites/danmunro/2015/12/31/data-breaches-in-healthcare-total-over-112-million-records-in-2015/

Nash, L. L. 2006. Inescapable ecologies: A history of environment, disease, and knowledge. Oakland, CA, USA: University of California Press.

Nelson, G. S. 2015. Practical implications of sharing data: A primer on data privacy, anonymization, and de-identification. Retrieved from https://www.semanticscholar.org/paper/Practical-Implications-of-Sharing-Data-A-Primer-on-Nelson/8a091b5cc4d3f861c0080d7b3ddf51b717244e6c

Nissenbaum, H. 2004. Privacy as contextual integrity. Washington Law Review, 79, 119.

Nissenbaum, H., and H. Patterson. 2016. Biosensing in context: Health privacy in a connected world. In Quantified: Biosensing technologies in everyday life, ed. Dawn Nafus, 79–100. Cambridge, MA, USA; London, UK: The MIT Press.

Olson, P. 2014, June 19. Wearable tech is plugging into health insurance. Retrieved October 23, 2016, from http://www.forbes.com/sites/parmyolson/2014/06/19/wearable-tech-health-insurance/

Privacy Rights Clearinghouse. 2013, July 15. Privacy Rights Clearinghouse releases study: Mobile health and fitness apps: What are the privacy risks? Retrieved October 23, 2016, from https://www.privacyrights.org/blog/privacy-rights-clearinghouse-releases-study-mobile-health-and-fitness-apps-what-are-privacy

Scholl, M. A., K. M. Stine, J. Hash, P. Bowen, L. A. Johnson, C. D. Smith, and D. I. Steinberg. 2008. An introductory resource guide for implementing the Health Insurance Portability and Accountability Act (HIPAA) Security Rule. Gaithersburg, MD, USA: National Institute of Standards & Technology, U.S. Department of Commerce.

Sherbit. 2015, April 6. How insurance companies profit from "wearables." Retrieved October 23, 2016, from https://www.sherbit.io/the-insurance-industry-and-the-quantified-self/

Snyder, M. 2015, June 19. Police: Woman's fitness watch disproved rape report. Retrieved October 23, 2016, from http://abc27.com/2015/06/19/police-womans-fitness-watch-disproved-rape-report/

Solove, D. 2002. Access and aggregation: Privacy, public records, and the Constitution. Minnesota Law Review, 86. Retrieved from http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2079&context=faculty_publications

Taylor, H. 2016, March 4. How the "Internet of Things" could be fatal. Retrieved September 22, 2016, from http://www.cnbc.com/2016/03/04/how-the-internet-of-things-could-be-fatal.html

Terry, K. 2014. Mobile polysensors offer new potential for patient monitoring. Medscape Medical News. Retrieved from http://www.medscape.com/viewarticle/828637

The Physicians Foundation. 2012, November. Drivers of healthcare costs white paper. Boston, MA, USA: The Physicians Foundation. Retrieved from http://www.physiciansfoundation.org/focus-areas/drivers-of-healthcare-costs-white-paper

U.S. Department of Health and Human Services. 2003. Summary of the HIPAA Privacy Rule. Washington, DC, USA: Office for Civil Rights. Retrieved from http://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/

U.S. Government Publishing Office. 2013, January 25. Federal Register, Vol. 78, Issue 17. Office of the Federal Register, National Archives and Records Administration. Retrieved from https://www.gpo.gov/fdsys/granule/FR-2013-01-25/2013-01073/content-detail.html

Walker, K. L. 2016. Surrendering information through the looking glass: Transparency, trust, and protection. Journal of Public Policy & Marketing, 35(1), 144–58.
