
Public Integrity, 20: S46–S59, 2018

Copyright © American Society for Public Administration


ISSN: 1099-9922 print/1558-0989 online
DOI: 10.1080/10999922.2018.1448218

Big Data, Big Concerns: Ethics in the Digital Age

Carole L. Jurkiewicz
University of Colorado Colorado Springs

The collection of data on citizens through digital portals is viewed by organizations as an
opportunity to create value, leverage competitive advantage, and maximize productivity and
efficiencies in service and product delivery. As a condition of accessing digital media, individuals
implicitly agree to allow the collection of data they generate while on a site, as well as content on
the devices used to access the sites, unless steps are taken to limit such access. The growth in the
amount of data collected has increased exponentially and has vastly outpaced awareness of or
concerns about privacy; liability; ownership; property rights; and ethical issues that are emerging
as citizens become aware of the consequences of trading privacy for access. While the topic
dominates the professional press, the scholarly literature has only just begun to investigate ethical issues in
the age of big data. This treatise will outline the scope of the issues; emerging problems; key ethical
considerations; and areas in critical need of research and development moving forward.

Keywords: artificial intelligence, big data, cybersecurity, privacy issues, technology ethics

Change is occurring at an ever-increasing rate as the pace of life in nearly all respects speeds
forward. Schmidt and Cohen (2013) note Buckminster Fuller's assertion, made in the early 1980s, that the
Knowledge Doubling Curve, wherein prior to 1900 human knowledge doubled every century, had quickened
to every quarter-century by the mid-1940s. Today, the rate of doubling, attributable to technological
advances, is considered to be every 18 months, and it is anticipated that within a few years the whole
of human knowledge will double every 12 hours. This rapid pace of change, while creating new
opportunities and advantages, also contributes to increasing levels of anxiety (Mah, Szabuniewicz,
& Fiocco, 2016) as well as concerns that artificial intelligence (AI), made possible through big data,
will make humans obsolete (Dowd, 2017). Ethical discourse, codes, guidelines, and standards have
been put into place, but they are reactive and lag behind the sector they’re intended to regulate,
making them effectively obsolete before being announced (Jurkiewicz & Giacalone, 2016).
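
To make these rates concrete, the implied annual multipliers follow directly from the doubling formula K(t) = K0 · 2^(t/T). The short sketch below (Python, using only the periods cited above, and approximating a month as 30 days) computes them:

```python
# Annual growth multiplier implied by each knowledge-doubling period cited
# above: K(t) = K0 * 2**(t / T), so one year multiplies knowledge by
# 2**(hours_per_year / T_hours). Periods are those reported in the text.
hours_per_year = 365 * 24

doubling_period_hours = {
    "every century (pre-1900)": 100 * hours_per_year,
    "every quarter-century (mid-1940s)": 25 * hours_per_year,
    "every 18 months (today)": 18 * 30 * 24,   # month approximated as 30 days
    "every 12 hours (projected)": 12,
}

for label, t in doubling_period_hours.items():
    print(f"{label}: x{2 ** (hours_per_year / t):.3g} per year")
```

Under the projected 12-hour period, knowledge would multiply by roughly 2^730 (about 10^219) per year, which conveys why regulation written at today's pace cannot keep up.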
With the globalization of economies and the speed with which markets can shift in response
to news from around the world, the lull of predictability that once existed has been erased, along
with the relative comfort of thinking legal and ethical systems will protect us. Growth of
the digital economy requires an individual to be attuned to 24/7 economic, political, and social
markets and take near instantaneous actions to maintain propitiousness. Shifting economic
affiliations; currencies; corruption in financial and public institutions; aggressive marketing
campaigns; and increasing uncertainties with regard to job security impose additional societal
stressors (Johnson, 2009; Schmidt & Cohen, 2013).

Correspondence should be sent to Carole L. Jurkiewicz, UCCS School of Public Affairs, 1420 Austin Bluffs
Parkway, Colorado Springs, CO 80918, USA. E-mail: cjurkiew@uccs.edu

Humans generally eschew change as it disrupts the physiological drive for homeostasis
(Jurkiewicz, 2008) and induces elevated levels of cognitive dissonance, amplifying the effects
of social influence and facilitating inappropriate emotional reactions (Guazzini, Yoneki, &
Gronchi, 2015). Although it creates efficiencies, technology also imposes the need to learn
new software and applications as well as their argot, and to adjust to increasing demands on
individuals’ time. It also facilitates bullying, as individuals can remain anonymous and as
targeted aggression is becoming more acceptable as a social norm (Brynjolfsson & McAfee,
2016). In addition to its use in the drive for market dominance, big data has abetted an
increasing number of individuals and organizations whose job or avocation it is to engage in
hacking, creating computer viruses and worms, advancing the dark Internet, proposing alterna-
tive currencies such as Bitcoin, and advocating extreme causes. Such concerns often leave indi-
viduals feeling helpless, inconsequential, gullible, less trusting of others and institutions, and
convinced that their unique identity is being manipulated and reduced to an algorithmic output (Giacalone
& Jurkiewicz, 2003). This thwarts fulfillment of the need for psychological safety and belongingness, as
well as of higher-order needs, which in turn creates additional stressors (Koltko-Rivera, 2006).
Such assaults require a balancing act, pitting the need for social connection and relevance
against concerns over manipulation and trickery. Citizens frequently seek legal solutions when
their feelings of safety and belongingness are violated, but these offer evanescent fixes at best,
and are not the ethical solutions needed to address the core issues.
Politics, both national and global, also increase societal pressures rooted in big data including
shifting loyalties; partisanship; political correctness; lobbying; campaign donations; govern-
ment and corporate corruption; illegal nonprofits; taking advantage of fears and uncertainties;
and continually morphing and complex campaign regulations (Grimmelikhuijsen & Snijders,
As global alliances are reshaped by personalities, electorates, economics, and social trends,
increasingly inventive methods of obfuscation that shield against transparency and accountability,
the power of lobbyists and political campaign contributions, and misinformation movements
create additional pressures to keep apace. The resultant concerns about one’s existence
and identity further push ethical considerations to the background at a time they are most
needed (Jurkiewicz & Grossman, 2012). While the eGovernment literature discusses collecting
data electronically, it focuses upon employing technology to enhance service and processes
rather than the moral challenges posed by big data, a much broader, complex, and multisector
topic. The purpose here is to introduce the expanse of ethical issues related to big data, provide
examples of the scope of these issues, and suggest approaches to address these concerns.

BIG DATA

The term “big data” was first used by NASA researchers in 1997 (Cox & Ellsworth, 1997); its
definition is still evolving, but the central elements consistently refer to massive volumes of
information collected through technological means, accumulating at such a velocity that
continually innovating information processing is required to keep pace. The phenomenon has
sparked a new economy and industry sector and has been the catalyst for rapidly transformative
technologies and applications. Given the vast range and amount of data, its utility has led to
systems that improve the quality of life, proprietary and government uses that many view as
secretive and manipulative, and ethical concerns that have yet to be fully articulated or addressed.
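
To illustrate the velocity element of the definition, the minimal sketch below (a hypothetical event stream; field names are invented for illustration) shows why high-velocity data forces constant-memory, on-the-fly summarization rather than store-then-analyze processing:

```python
from collections import Counter

def summarize_stream(events):
    """Constant-memory running summary of an unbounded event stream;
    storing every record for later batch analysis could not keep pace."""
    count, by_type = 0, Counter()
    for event in events:                # the stream may never terminate
        count += 1
        by_type[event["type"]] += 1
        if count % 1_000_000 == 0:      # periodic snapshot instead of full storage
            print(count, by_type.most_common(3))
    return count, by_type

# Tiny synthetic stream for illustration:
sample = iter([{"type": "click"}, {"type": "view"}, {"type": "click"}])
print(summarize_stream(sample))         # (3, Counter({'click': 2, 'view': 1}))
```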

The majority of information that people generate through the use of technology is accessible,
or hackable, by anyone with sufficient access. Although they offer this content freely, the majority
of individuals presume some sense of privacy or ownership over what they generate
(Johnson, 2009; Riglian, 2012). Recognizing this presumption and wanting to avoid conflicts
arising from individuals’ beliefs about privacy violations, social media companies and
government agencies have instituted business models which are dependent upon this shared
content, but do not openly disclose that they are continually tracking and recording not only
the content of the information sent, but the demographics; psychographics; health; geographical
location; devices; search histories; purchasing histories; and other data about the user. Whether
it is a Website; social media forum; text; application; software; or anything accessed through
technological devices, individuals, by virtue of use, are implicitly agreeing to allow those com-
panies to bundle and use or sell that data without hindrance. Everything one says or does using
technology effectively creates a currency from which others are profiting.
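
As a purely hypothetical illustration of how much rides along with a single “free” interaction, the record below mirrors the categories listed above; every field name is invented for this sketch and corresponds to no vendor's actual schema:

```python
# Hypothetical per-interaction record; all field names are invented for
# illustration and correspond to no vendor's actual schema.
interaction_record = {
    "content":          "search: knee pain treatment",   # what the user sent
    "demographics":     {"age_band": "35-44", "inferred_gender": "F"},
    "psychographics":   {"interests": ["fitness", "parenting"]},
    "health_signals":   ["orthopedic"],
    "geo_location":     {"lat": 38.89, "lon": -104.80},
    "device":           {"model": "iPhone X", "os": "iOS 11"},
    "search_history":   ["running shoes", "physical therapy near me"],
    "purchase_history": ["knee brace"],
}

# By using the service, the individual has implicitly licensed this bundle,
# which can be aggregated and sold without further consent.
print(f"{len(interaction_record)} data categories captured from one interaction")
```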
Programs embed decision-making with de facto ethical implications in their design, and by
so doing are predetermining options to act with regard to such products as self-driving autos;
smart homes; smart autos; GPS; missile defense systems; educational delivery; clothing; music;
and cyber security, to name a few (Bonnefon, Shariff, & Rahwan, 2016; Getha-Taylor, 2016).
Thus, coders, about whom little is known, are subcontracted by, or employees of, a technologi-
cal entity to create the algorithms, often prejudicial ones (Chiel, 2018), that effectively restrict the range
of moral choice (Jurkiewicz, 2002). What we see, have access to, or are aware of is limited by
these algorithms, and their ethical impacts are unknown until the consequences are revealed
(Etzioni, 2014), primarily through news reports such as those about self-driving autos deciding
suicide by collision is preferred over impacting another vehicle; crashing into trucks the color of
which the system cannot differentiate from a skyline; hacking of military defense systems,
medical records, utilities, and the financial infrastructure; the usurping of votes in elections;
fake advertisements feeding tendencies toward groupthink and fear aversion; and creating tech-
nological dependencies in order to exercise influence, to name a few. Additional concerns
include governments spying on citizens; profiling by criminal justice entities; ownership and
copyright exclusions; blurred lines between privacy guarantees and full disclosure (Hastreiter,
2017); inaccurate data leading to invalid algorithms; fake identities and cyber theft; and student
cheating and declining trust in higher education.
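
A minimal, entirely hypothetical sketch of how design-time numbers can predetermine a “moral” outcome in, for example, a collision-avoidance routine; the weights and scenario are invented for illustration:

```python
# Hypothetical collision-avoidance chooser: the harm weights below are a
# coder's design-time values, yet they silently decide who bears the risk.
HARM_WEIGHTS = {"occupant": 1.0, "pedestrian": 1.0, "other_vehicle": 0.8}

def choose_maneuver(options):
    """options: list of (maneuver_name, {party: expected_harm}) pairs.
    Returns the maneuver with the lowest weighted expected harm."""
    def cost(harms):
        return sum(HARM_WEIGHTS[party] * h for party, h in harms.items())
    return min(options, key=lambda opt: cost(opt[1]))[0]

scenario = [
    ("brake_straight", {"occupant": 0.9, "pedestrian": 0.0}),
    ("swerve_left",    {"occupant": 0.2, "other_vehicle": 0.7}),
]
print(choose_maneuver(scenario))  # 'swerve_left': the weights, not the driver, decide
```

Changing a single weight changes which party is put at risk, yet nothing in the vehicle's behavior reveals that an ethical judgment was made at design time.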

EXAMPLES OF ETHICAL CONCERNS ACROSS THE SPECTRUM

Below is a sampling of ethical issues regarding big data that organizational; social; legal; and
political toolkits do not address. They represent a spectrum of current topics that affect society,
sorted by themes to facilitate connecting examples to key ethical topics. While there are many
benefits to be extracted from big data, the focus here is on underscoring the need for governments to
establish protections and recourse for citizens harmed by the culling of personal transmissions.

Data Collected Under the Guise of Social Betterment

Collecting data on individual behaviors and preferences for profit while publicly framing it
as a benefit is a key ethical concern. One example is the new iPhone X, whose facial recognition
system is touted as a security benefit while collecting this data to create saleable content (Poell,
2015; Stanley, 2017) to governments; criminal organizations; proprietary organizations; and to
expand Apple’s own product offerings. Additionally, the large genomic database to which indi-
viduals are paying to contribute, under the innocuous promise of discovering one’s ancestry, is
being sold to insurers; healthcare institutions; research companies; criminal justice entities; and
attorneys, among other entities (DNAeXplained, 2015).
Blackboard and Turnitin, two software tools widely used by many universities as incidences
of plagiarism increase (AIC, 2018; Anonymous Academic, 2017), also track all accessible
information on our computers or included in course content, student/professor communications,
and grade reports (Boyd, 2011). Selling online courses with content created by professors using
Blackboard and its affiliate programs, and selling students’ term papers submitted to Turnitin
(Blackboard, 2016; Green, 2012; Hendry, 2009; Lieberman, 2017; Neal, 2014; Roll, 2017;
Young, 2008; Zimmerman, 2008) is a growing market (e.g., U.S. SEC, 2015).
Sex robots that mimic behaviors displayed in pornography, collected through big data, are
now being produced for multiple reasons, including sexual pleasure; therapy; enacting fantasies
of rape; pedophilia; and prostitution. While advocated as tools to lessen sexual behaviors that
are generally viewed as both immoral and illegal, in practice they are serving to normalize
such behaviors and to increase their incidence in the populace. Cyborg sex cafes, wherein one can
order a robotic sexual experience along with a beverage, are becoming more widespread,
as is the spread of sexually transmitted diseases resultant of shared equipment (Siddique,
2017). Further, successful lawsuits have been brought against sex toy manufacturers for
violating privacy rights by collecting and selling data transmitted through their usage (Baker,
2017).

Facilitating Unequal Wealth Distribution

Concerns about AI include the dehumanization of individual choice, essentially replacing
individual identity with collective, computerized model citizens and employees (Cellan-
Jones, 2014; Garling, 2014; Simonite, 2017). By 2025, roughly 25–35% of today’s jobs
will be replaced by AI tools developed from big data (Wakefield, 2015), resulting in an
expected $20 trillion profit by eliminating human resource costs. Such shifts will dispropor-
tionately harm lower socioeconomic groups, as unskilled jobs will be eliminated first,
protecting those of greater complexity and further deepening the divide between economic
classes (Flynn & Robu, 2017). This is expected to lead to greater social strife, extreme
concentrations of power, and widespread government dependence (World Economic Forum,
2016).
Collection of big data voluntarily (under the guise of making our lives easier and fitting in
with social groups), and involuntarily (by using software and applications that implicitly allow
the collection of our data from any device and for whatever purposes the aggregator decides) is
expected to increase. Deep-learning algorithms are continuing to be developed that build the
large data sets corporations such as Facebook and Google use to track all interactions with their
sites as well as where we travel; what we see; what we consume; and with whom we interact;
among other parameters. Google, among many organizations, is training robots to understand
and behave like humans by watching YouTube videos (Bagot, 2017).

Individuals and organizations that are able to afford advanced computer systems, collections
of big data, and people to operate and analyze it are able to conduct financial trades ahead of
those who do not have these systems, affording them much higher profits. Additionally, the
Securities and Exchange Commission lacks the resources to configure systems that maintain
continual security coverage to protect us from those seeking to profit by overriding its outdated
systems, unencrypted software, and inadequate firewalls; it is a common and lucrative target for
hackers (Price, 2017).
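
The trading advantage is mechanical: whichever order reaches the market first trades at the stale price. A toy sketch, with purely illustrative latencies:

```python
# Toy model of a latency race: both parties see the same public signal, but
# the better-equipped firm's order reaches the market before the price adjusts.
# All numbers are illustrative, not measured values.
latency_seconds = {"big-data trading firm": 0.0005, "ordinary investor": 0.3}
price_adjusts_at = 0.01  # seconds after the signal becomes public

for trader, lag in sorted(latency_seconds.items(), key=lambda kv: kv[1]):
    outcome = ("trades at the stale price" if lag < price_adjusts_at
               else "arrives too late")
    print(f"{trader}: order arrives at t={lag:.4f}s and {outcome}")
```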
Cashless societies, advocated by the United Nations, are promoted as solutions to theft,
counterfeiting, and increasingly volatile financial valuations, and as a means for governments
to track unreported income; yet the concept also furthers wealth disparities and disadvantages
the most impoverished individuals (Sorrel, 2016). A cashless economy disallows negotiation; cash
transactions in areas of high illiteracy; street-level charity; and the flexibility cash economies offer
those in desperate political, social, and domestic situations; and it fails to take into account
the lack of financial institutions in remote areas with unreliable electronic infrastructure
(Taylor, 2017).

Increasing Social Hostilities for Political Gain

The use of fake news bots and “alternative fact” Websites in the 2016 U.S. presidential election
exposed approximately 25% of the U.S. population to falsehoods about then-candidate Trump,
10% of whom returned often to these sites despite being made aware of the falsehoods (Guess,
Nyhan, & Reifler, 2018). Tracking these contacts, algorithms were created that fed viewers
more of the same, expanding exposure to misinformation. Spreading false information through
social media, e-mails, and phishing scams primes the population to accept misinformation as the
new normal, arguably threatening the basis of democracy (Clifford, 2017; Quick, 2016).
Although it fosters an ethically bereft sense of shared identity, such misinformation is effective in
mobilizing support for polarizing leaders and their initiatives (Breland, 2017; Haslam, Reicher, & Platow,
2010; Kaplan, 2017). Russian influence in the U.S. election of 2016 is another manifestation of
how big data can influence democracy (Graff, 2017); it is also used to predetermine which Con-
gressional bills will pass and to influence individual voters. Cambridge Analytica is credited
with creating voter profiles used to facilitate Trump’s election (Confessore & Hakimmarch,
2017; Polonski, 2017).
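
The amplification mechanism described above, tracking contacts and feeding viewers more of the same, is in essence a naive engagement-maximizing recommender. A minimal sketch with synthetic data:

```python
from collections import defaultdict

# Naive engagement-maximizing recommender: items a user clicked get boosted,
# so misinformation that draws clicks is shown ever more often.
clicks = defaultdict(int)

def record_click(user, topic):
    clicks[(user, topic)] += 1

def rank_feed(user, topics):
    # Score is purely past engagement; nothing checks truthfulness.
    return sorted(topics, key=lambda t: clicks[(user, t)], reverse=True)

record_click("u1", "fabricated_story")
record_click("u1", "fabricated_story")
record_click("u1", "news")
print(rank_feed("u1", ["news", "fabricated_story", "sports"]))
# -> ['fabricated_story', 'news', 'sports']: the loop rewards what it already fed
```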
Social media sites allow for the targeting of specific demographics that facilitate access
by hate groups, which accumulate information on users and target segments of the populace
for discrimination. Spam e-mails and ads framed as supportive sites encourage clicks and
advance prejudice (Chiel, 2018); viewers are enticed to express hatred toward specific groups,
and in turn are identified and supported in their hate speech through targeted advertisements and
calls to action (Byrne, 2017; Collins, Poulsen, & Ackerman, 2017).

Concealed Data Collection for Profit

Ubiquitous camera systems, ostensibly for safety or convenience, generate data that are
aggregated and resold at considerable profit. One’s sexual orientation can be determined with
91% accuracy by analyzing facial features, information that has been used to discriminate,
disadvantage, and violate individual privacy (Kosinski & Wang, 2017). These visual databases
have been used by governments for criminal capture, security-based identification, and political
and social activism (Gershgorn, 2017; Scassa, 2017), as well as lip-reading with instant
language translations, furthering concerns about invasions of privacy (Murphy, 2016).
Products such as Apple’s Siri, Amazon Echo, Microsoft’s Cortana, and Google Home
promise to streamline human activities and efficiencies, yet collect and record all sounds, even
when not in use. The data provide unprecedented access to private information, and bolster the
specificity and scope of big data collection (Cassano, 2017), as well as the value of such data in
the marketplace. Similarly, all “connected cars” automatically send data on driver behavior;
location; mechanicals; and more to feed a data bundle of which the market value is predicted
to be in the billions by 2020 (Gitlin, 2018).

Facilitating Human Dependencies on Technology

Creating big data-based applications that decrease the need for complex, logical thinking is
reducing these abilities in humans, deepening dependencies upon technologies and lessening
the ability to tackle crisis situations that applications do not address (Coopersmith, 2017;
Firestone et al., 2012). Because technologies are making more decisions on our behalf, and
because the algorithms behind them are indecipherable to us, the need for safeguards is vital
(Kuang, 2017).
Social media and applications such as Facebook, Uber, and Google, once downloaded, allow
access to communications, stored data, places traveled, and sites searched on one’s device, even
when the application is closed. This information can be sold; partitioned; aggregated; or used in
any way the organization chooses (Sulleyman, 2017). Facebook’s analytics subsidiary, Onavo,
offers a free application promoted as keeping your data safe, yet it collects and feeds all infor-
mation to Facebook, including information on other applications a user accesses in an effort to
maximize market domination (Seetharaman & Morris, 2017; Sulleyman, 2017).
Instantaneous social communication and feedback fuels the rise in narcissism and, given
the social construction of reality, determines acceptable opinions; appearance; behaviors; social
aggression; and social displays; it is also being manipulated to influence decision-making (Chamorro-
Premuzic, 2014; Marshall, Lefringhausen, & Ferenczi, 2015; Pearse, 2012; Twenge, 2013).
Dependence upon social media for one’s self-esteem, coupled with other big data influencers,
is believed to be the cause of increasing depression, suicide, and social and psychological
dysfunction among U.S. youth (Atanasova, 2016; Buffardi & Campbell, 2008; McCain &
Campbell, 2017; Weiser, 2015).

Undermining Individual Integrity

The phenomenon known as the bystander effect (Latane & Darley, 1969) has been expanded
through smart technologies (Stanley, 2017). By providing an audience, they facilitate the
filming of acts of violence; deaths; discrimination; and life-threatening circumstances without
offering assistance as one is likely to do if acting alone (Badalge, 2017). These norms spread
more rapidly through big data and have greater impact than when the theory was first asserted;
the most negative, hate-filled, and discriminatory messages are most contagious on social media
and most influential in establishing social and ethical norms (Pressler, 2017).
Axon, a police camera company, has been collecting data recorded on its devices, parsing it,
and selling it to a variety of organizations for a wide range of purposes. It discovered that
providing free cameras to police departments not only gained it goodwill and cemented future
sales, but also created a separate and more lucrative market for selling this data to governments;
agencies; proprietary organizations; software developers; and numerous others. Such data
challenges both the legal provisions against invasion of privacy as well as introduces several
ethical concerns (Cassano, 2017).

SUMMARY

This overview provides a sense of the scope of the ethical issues introduced by big data. While
collecting data on almost everyone about everything is possible, should we? What do we know
about what is collected about us, how it is used, and how it is unwittingly influencing the
choices we make? Evidence of violations of privacy and free will are just now coming to light,
and it is highly likely that what we don’t know is more frightening and pervasive than what we
do. As the rate of change continues to increase, how can we keep current, let alone anticipate
what is to come? Thomson (2015) has ventured that by 2025, we will work with robots on a
regular basis, including some responsible for life-and-death issues, as well as those serving
on corporate boards and providing healthcare; that over 50% of Internet activity will comprise
control functions for offices, homes, and transportation; that functional human organs and
autos will be 3-D printed; and that cities will have streets without traffic signals, to name some
anticipated developments. With an expected 90% of the human population having Internet
connectivity and free, unlimited data storage, the ability for big data to exercise influence will
increase exponentially. The need to address questions of ethicality related to this phenomenon is
overdue.

NEXT STEPS

Calls for regulation of big data and protection of individual data are growing. As widely
reported, we know that self-policing by big data firms does not work (cf. Fair, 2017; Lapowsky,
2015). Industry and professional ethical codes have not worked either (cf. Bohannon, 2016;
Bynum, 2016). No laws in the United States specifically target big data (Wessing, 2014),
and those tangential to the issue have also largely failed. The reasons include an assertion
that capitalism drives profit at any cost (Walker, 2015); political contributions wield influence
over lawmakers (Glazek, 2017); lack of ethics code enforcement (Bynum, 2016); a dearth of
legal statutes (Wessing, 2014); and the pervasiveness and power of big data’s lobbying industry
(LaPira & Thomas, 2017). Some scholars have called for principle-based ethical
models developed and monitored by professional associations (Moor, 2008; Wiener, 1964),
but without enforcement mechanisms, it is unlikely these would be effective. The European
Union instituted stringent laws regarding the collection and use of big data, and requires all
algorithms to be readily and clearly understandable to the citizenry; the laws have strong
enforcement penalties in the billions of dollars for those who do not comply (Kuang, 2017).
This approach—and other similar statutes specifically targeting big data—is needed in the
United States, as are safeguards to prevent dissolution of their enactment and enforcement through
political influence.
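
What “readily and clearly understandable” might look like in practice is suggested by the sketch below: a hypothetical scoring rule whose every factor and weight is legible, so the system can state the reason for each output. The factors and weights are invented for illustration:

```python
# Hypothetical transparent scoring rule: every factor and weight is legible,
# so the system can explain each decision -- the property the EU rules demand.
FACTORS = [("on_time_payments", 2.0), ("years_of_history", 0.5), ("defaults", -3.0)]

def score_with_explanation(applicant):
    contributions = [(name, w * applicant[name]) for name, w in FACTORS]
    total = sum(c for _, c in contributions)
    explanation = "; ".join(f"{name}: {c:+.1f}" for name, c in contributions)
    return total, explanation

total, why = score_with_explanation(
    {"on_time_payments": 10, "years_of_history": 4, "defaults": 1}
)
print(f"score={total:.1f} because {why}")
# score=19.0 because on_time_payments: +20.0; years_of_history: +2.0; defaults: -3.0
```

A deep-learning model offers no analogous statement of reasons, which is precisely why explainability must be mandated rather than assumed.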

AN IMMODEST PROPOSAL

The United States is behind the curve on establishing boundaries for the collection and use of
big data, and every day it falls tens of trillions of pieces of data further behind. What to do, and
where to start? Adopting laws and penalties specific to big data regulation similar to those in the
EU is a start. These laws should require organizations to seek approval of data collection and
demonstrate how it specifically fits with their business model. In addition, laws with stiff
penalties to halt the influence of lobbying, gifts, and political contributions need to be
developed immediately, with executive action in the interim. Accountability should be attached
to those at the top of organizations that violate these laws, and they should be fired and
disallowed from conducting business in any data-related capacity into the future. Current
privacy laws and those covering property and ownership of data need to be updated and
strictly enforced, with extreme penalties for violators. The laws must also encompass account-
ability for the mutability of names related to big data, as renaming big data to data analytics,
and artificial intelligence to artificial machines, for example, can render laws without such
flexibility unenforceable.
In addition to the urgent need for objective research to understand fully the dimensions of
ethicality in big data, upon which effective policies can be developed, citizens must shed
their complacency and automatic acceptance of new technologies and software.
Humans, being fundamentally social beings, will gravitate to systems that facilitate communi-
cation, and thus need to be educated about how and when their data are being sourced, with
accessible and responsive authorities to whom they can report violations. They need to be made
aware of their legal and constitutional rights, and all organizations collecting data of any type
from individuals must engage in radical transparency by fully disclosing that fact and how the
data will be used in simple and direct language. The need for similar laws in all countries would
quell the ability to outsource unethical operations.
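
Radical transparency could be operationalized by generating the plain-language disclosure directly from an organization's declared collection manifest, so the notice cannot drift from actual practice. A minimal sketch with hypothetical fields:

```python
# Hypothetical collection manifest; the plain-language notice is generated
# from it, so the disclosure cannot diverge from what is actually collected.
manifest = {
    "location": {"purpose": "route suggestions", "sold_to_third_parties": True},
    "contacts": {"purpose": "friend finding",    "sold_to_third_parties": False},
}

def plain_language_disclosure(manifest):
    lines = []
    for data_type, terms in manifest.items():
        sold = ("and sell it to third parties"
                if terms["sold_to_third_parties"] else "and do not sell it")
        lines.append(f"We collect your {data_type} for {terms['purpose']} {sold}.")
    return "\n".join(lines)

print(plain_language_disclosure(manifest))
```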
Additionally, ethical codes need to be updated regularly and accompanied by powerful
enforcement and penalty tools to assure compliance. Ethics officers should be required in all
organizations above a set size budget, with the reporting structure for these officers being an
independent government or legal body. Given that big data is a global phenomenon, and that
cultural issues have traditionally posed impediments in crafting an ethical code that bridges
these differences (Jurkiewicz, 2012), Wiener’s (1954) classic, The Human Use of Human
Beings, the first to address the impact of technology on values such as health; happiness;
freedom; security; and quality of life, offers a framework for ethics and technology that may
provide the basis for this common foundation (adapted in Table 1), a possible rubric against
which to compare differing assertions on ethicality.
This is only a start, and it will be expensive; it is suggested that all organizations that profit
from big data be required to donate a set percentage of pretax gross profits to a fund to support
these and future efforts. Such a requirement should have no exceptions and should be
enforceable to the extent that organizations can be dissolved if they fail to comply. These assertions
are stringent, but nothing short of this will have any effect in reining in the ethical violations
connected with big data.

TABLE 1
Tenets for the Development of a Global Code of Ethics for Big Data

Benevolence: Individuals should be respected and valued in equal proportions simply by virtue of being human.
Equality: What serves as justice between two individuals or an individual and/or entity should remain the same when
the roles of those two individuals and/or entities are reversed.
Freedom: Individuals should have the autonomy to achieve the full expression of each of the human capabilities s/he
possesses.
Independence: The satisficing required of individuals to participate in society should impinge upon their liberties only
as is essential.

Source: Adapted from Wiener, 1954, p. 106.

CONCLUSION

Along with the benefits of big data, such as efficiency, scientific advances, and minimizing
the distance between individuals around the globe, come an increasing number of ethical con-
cerns. Driven by the government and proprietary sectors, the growth of unregulated big data
poses a threat to social and political systems; our psychological, emotional, and physical
health; financial and educational systems; economic welfare; social civilities; and individual
identity.
In addition to the need for objective research on which effective policies can be developed
is the concomitant need for legislative action to protect individual rights and to hold
organizations and individuals legally accountable for invasions of privacy and lack of trans-
parency. While legal action is not a substitute for ethical mobilization, it can move much
more quickly in protecting societies and individuals as ethical systems are developed and
implemented.
An overarching concern related to the interwoven nature of big data across borders is the
question of who should decide the ethical and legal limits, or if they should be imposed
at all. Those with the fastest growing digital economy? This would put Singapore, the
UAE, and New Zealand in charge. The recognized technology up-and-comers? Saudi
Arabia, Latvia, China, and Portugal, to name just four. Those that are on rapid
ascent, among them Egypt, Pakistan, and Nigeria? The United States is included in the
group of countries where the digital economy has stalled, along with South Korea, the
Netherlands, and Denmark (Chaturvedi, Bhalla, & Chakravorti, 2017), so should their
voices be diminished? Who has the authority to make such pronouncements, and what
would lead others to follow?
As top technology leaders, including Elon Musk, Bill Gates (Simonite, 2017), and Stephen
Hawking (Cellan-Jones, 2014), have articulated, big data is the most serious threat we face, and
failing to address the related ethical concerns could, and likely will, lead to the end of the human
race (Bagot, 2017).
At what point will such a warning be moot?

REFERENCES

Academic Integrity Council (AIC). (2018). A report on the 2005–2006 survey. Durham, NC: Duke University Press.
Retrieved from http://integrity.duke.edu/reports/Survey%202005,%20Report,%20Final%20version,%20May%
2025,%202006.doc
Anonymous Academic. (2017, October 12). Students cheat in ever more creative ways: How can academics stop them?
The Guardian. Retrieved from https://www.theguardian.com/higher-education-network/2017/oct/12/students-in-
ever-more-creative-ways-how-can-academics-stop-them
Atanasova, A. (2016, November 26). How to spot a narcissist on social media. Social Media Today. Retrieved from
http://www.socialmediatoday.com/social-networks/how-spot-narcissist-social-media
Badalge, K. N. (2017, June 12). Smartphones haven’t made us into activists: They’ve turned us into helpless bystanders.
World Economic Forum. https://www.weforum.org/agenda/2017/06/your-phone-might-make-you-feel-like-an-
activist-but-its-preventing-you-from-really-helping-1
Bagot, M. (2017, October 26). Google is training robots to understand humans by making them binge-watch YouTube
videos. The Mirror. Retrieved from http://www.mirror.co.uk/tech/google-training-robots-understand-humans-11413196
Baker, V. (2017, March 15). Can a sex toy spy on you? BBC News. Retrieved from http://www.bbc.com/news/world-us-
canada-39280941
Blackboard Lifecycle Services. (2016). Washington, DC: Blackboard. Retrieved from http://www.blackboard.com/
sites/student-services
Bohannon, J. (2016). Who’s downloading pirated papers? Everyone. Science, 352(6285), 508–512. doi:10.1126/
science.352.6285.508.
Bonnefon, J., Shariff, A., & Rahwan, I. (2016). The social dilemma of autonomous vehicles. Science, 352(6293),
1573–1576. doi:10.1126/science.aaf2654.
Boyd, R. (2011, August 29). Blackboard: A tale of 2 companies. Seeking Alpha. Retrieved from https://seekingalpha.
com/article/290299-blackboard-a-tale-of-2-companies
Breland, A. (2017, September 9). Trump supporters dig up personal information on thousands of Trump opponents. The
Hill. Retrieved from http://thehill.com/policy/technology/351825-trump-supporters-have-built-a-massive-list-of-
their-opponents-personal
Brynjolfsson, E., & McAfee, A. (2016). The second machine age: Work, progress, and prosperity in a time of brilliant
technologies. New York, NY: W. W. Norton & Co.
Buffardi, L. E., & Campbell, W. K. (2008). Narcissism and social networking Web sites. Personality and Social
Psychology Bulletin, 34(10), 1303–1314. doi:10.1177/0146167208320061.
Bynum, T. (2016, Winter). Computer and information ethics. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy.
Stanford, CA: Stanford University. Retrieved from https://plato.stanford.edu/archives/win2016/entries/ethics-computer.
Byrne, B. P. (2017, September 16). Twitter says it fixed “bug” that let marketers target people who use the N-word. The
Daily Beast. Retrieved from http://www.thedailybeast.com/twitter-lets-you-target-millions-of-users-who-may-like-
the-n-word
Cassano, J. (2017, August 16). Police body camera company, Axon, is vacuuming in data, stoking privacy concerns. IB
Times. Retrieved from http://www.ibtimes.com/political-capital/police-body-camera-company-axon-vacuuming-
data-stoking-privacy-concerns-2579107
Cellan-Jones, R. (2014, December 2). Stephen Hawking warns artificial intelligence could end mankind. BBC News.
Retrieved from http://www.bbc.com/news/technology-30290540
Chamorro-Premuzic, T. (2014, March 13). Sharing the (self) love: The rise of the selfie and digital narcissism. The
Guardian. Retrieved from https://www.theguardian.com/media-network/media-network-blog/2014/mar/13/selfie-
social-media-love-digital-narcassism
Chaturvedi, R., Bhalla, A., & Chakravorti, B. (2017, July 18). These are the world’s most digitally advanced countries.
World Economic Forum. Retrieved from https://www.weforum.org/agenda/2017/07/these-are-the-worlds-most-
digitally-advanced-countries
Chiel, E. (2018, January 23). The injustice of algorithms. New Republic. Retrieved from https://newrepublic.com/
article/146710/injustice-algorithms
Clifford, V. (2017). News brands are fighting fake news to ensure it does not become “new normal” in 2018.
Campaignlive. Retrieved from https://www.campaignlive.co.uk/article/news-brands-fighting-fake-news-ensure-
does-not-become-new-normal-2018/1452836

Collins, B., Poulsen, K., & Ackerman, S. (2017, September 17). Russia used Facebook events to organize anti-immigrant
rallies on U.S. soil. The Daily Beast. Retrieved from http://www.thedailybeast.com/exclusive-russia-used-facebook-
events-to-organize-anti-immigrant-rallies-on-us-soil
Confessore, N., & Hakimmarch, D. (2017, March 6). Data firm says “secret sauce” aided Trump; many scoff. New York
Times. Retrieved from https://www.nytimes.com/2017/03/06/us/politics/cambridge-analytica.html
Coopersmith, J. (2017, June 30). Is technology making us dumber or smarter? Yes. The Conversation. https://theconversation.
com/is-technology-making-us-dumber-or-smarter-yes-58124
Cox, M., & Ellsworth, D. (1997). Application-controlled demand paging for out-of-core visualization. Proceedings
from VIS ‘97: The Eighth International Conference on Visualization, Los Alamitos, CA. Retrieved from https://
dl.acm.org/citation.cfm?id=267068
DNAeXplained—Genetic Genealogy. (2015, December 30). 23andMe, Ancestry and selling your DNA information.
Retrieved from https://dna-explained.com/2015/12/30/23andme-ancestry-and-selling-your-dna-information
Dowd, M. (2017, March 26). Elon Musk’s billion-dollar crusade to stop the A.I. apocalypse. Vanity Fair. Retrieved
from https://www.vanityfair.com/news/2017/03/elon-musk-billion-dollar-crusade-to-stop-ai-space-x
Etzioni, A. (2014). The new normal: Finding a balance between individual rights and the common good. New York,
NY: Routledge. doi:10.4324/9781315133447.
Fair, L. (2017, August 15). FTC says Uber took a wrong turn with misleading privacy security promises. Federal Trade
Commission News. Retrieved from https://www.ftc.gov/news-events/blogs/business-blog/2017/08/ftc-says-uber-
took-wrong-turn-misleading-privacy-security
Firestone, R. W., Firestone, L., & Catlett, J. (2012). The self under siege: A therapeutic model for differentiation. New
York, NY: Routledge.
Flynn, D., & Robu, V. (2017, August 16). Invasion? Takeover? Opportunity? What the robots mean for jobs. World
Economic Forum. Retrieved from https://www.weforum.org/agenda/2017/08/invasion-takeover-opportunity-what-
the-robots-mean-for-jobs
Garling, C. (2014, January 31). As artificial intelligence grows, so do ethical concerns. SFGate. Retrieved from http://
www.sfgate.com/technology/article/As-artificial-intelligence-grows-so-do-ethical-5194466.php
Gershgorn, D. (2017, August 27). The age of AI surveillance is here. Quartz Media. Retrieved from https://qz.com/
1060606/the-age-of-ai-surveillance-is-here
Getha-Taylor, H. (2016). The problem with automated ethics. Public Integrity, 19(4), 299–300. doi:10.1080/
10999922.2016.1250575.
Giacalone, R. A., & Jurkiewicz, C. L. (2003). Workplace spirituality: On the need for measurement. Journal of Orga-
nizational Change Management, 16(4), 396–399. doi:10.1108/09534810310484154.
Gitlin, J. M. (2018, February 22). Car companies are preparing to sell driver data to the highest bidder. Ars Technica.
Retrieved from https://arstechnica.com/cars/2018/02/no-one-has-a-clue-whats-happening-with-their-connected-
cars-data
Glazek, C. (2017, December 14). How to spend money on elections—And actually win. Fortune. Retrieved from http://
fortune.com/2017/12/14/political-campaign-donations
Graff, G. (2017, August 13). A guide to Russia’s high tech tool box for subverting US democracy. Wired. Retrieved
from https://www.wired.com/story/a-guide-to-russias-high-tech-tool-box-for-subverting-us-democracy
Green, K. C. (2012, April 2). The long (and open?) view on blackboard. Inside Higher Ed. Retrieved from https://www.
insidehighered.com/blogs/digital-tweed/long-and-open-view-blackboard
Grimmelikhuijsen, S., & Snijders, B. (2016). What happens after the storm? Investigating three conditions under which
local governments change integrity policy after scandals. Public Integrity, 18(4), 342–358. doi:10.1080/
10999922.2016.1172931.
Guazzini, A., Yoneki, E., & Gronchi, G. (2015). Cognitive dissonance and social influence effects on preference judgments:
An eye tracking based system for their automatic assessment. International Journal of Human-Computer Studies, 73,
12–18. doi:10.1016/j.ijhcs.2014.08.003.
Guess, A., Nyhan, B., & Reifler, J. (2018, January 9). Selective exposure to misinformation: Evidence from the
consumption of fake news during the 2016 U.S. presidential campaign. European Research Council. http://www.
dartmouth.edu/~nyhan/fake-news-2016.pdf
Haslam, S. A., Reicher, S. D., & Platow, M. J. (2010). The new psychology of leadership: Identity, influence and power.
London, UK: Psychology Press. doi:10.4324/9780203833896.
Hastreiter, N. (2017, September 28). What’s the future of cybersecurity? Huffington Post. Retrieved from https://www.
huffingtonpost.com/entry/ask-the-thought-leaders-whats-the-future-of-cybersecurity_us_592d8be7e4b07d848fdc068b

Hendry, E. (2009, August 3). Students reach settlement in Turnitin suit. Chronicle of Higher Education. Retrieved from
https://www.chronicle.com/blogs/wiredcampus/students-reach-settlement-in-turnitin-suit/7569
Johnson, D. G. (2009). Computer ethics. New York, NY: Pearson.
Jurkiewicz, C. L. (2002). The influence of pedagogical style on students’ level of ethical reasoning. Journal of Public
Affairs Education, 8(4), 263–274. doi:10.2307/40215580.
Jurkiewicz, C. L. (2008). Class and crisis: Socioeconomic status and the ethics of individual experience. Public
Manager, 38, 3. Retrieved from https://www.td.org/magazines/the-public-manager/class-and-crisis-socioeconomic-
status-and-the-ethics-of-individual-experience
Jurkiewicz, C. L. (2012). Developing a multicultural organizational code of ethics rooted in the moral obligations of
citizenry. Public Organization Review, 12(3), 243–249. doi:10.1007/s11115-012-0187-6.
Jurkiewicz, C. L., & Giacalone, R. A. (2016). Organizational determinants of ethical dysfunctionality. Journal of
Business Ethics, 136(1), 1–12. doi:10.1007/s10551-014-2344-z.
Jurkiewicz, C. L., & Grossman, D. (2012). Evil at work. In C. L. Jurkiewicz (Ed.), The foundations of organizational
evil. Armonk, NY: M.E. Sharpe. doi:10.4324/9781315699745.
Kaplan, A. (2017, July 7). Pro-Trump media claim “shadow President” Obama is violating the Logan Act. Media
Matters. Retrieved from https://www.mediamatters.org/blog/2017/07/07/Pro-Trump-media-claim-shadow-President-
Obama-is-violating-the-Logan-Act/217176
Koltko-Rivera, M. E. (2006). Rediscovering the later version of Maslow’s hierarchy of needs: Self-transcendence and
opportunities for theory, research, and unification. Review of General Psychology, 10(4), 302–317. doi:10.1037/
1089-2680.10.4.302.
Kosinski, M., & Wang, Y. (2017, July 4). Deep neural networks are more accurate than humans at detecting sexual
orientation from facial images. Open Science Framework.
Kuang, C. (2017, November 26). Can A.I. be taught to explain itself? New York Times Magazine, pp. 46–53. Retrieved
from https://www.nytimes.com/2017/11/21/magazine/can-ai-be-taught-to-explain-itself.html
LaPira, T. M., & Thomas III, H. F. (2017). Revolving door lobbying: Public service, private influences, and the unequal
representation of interests. Lawrence, KS: University Press of Kansas. doi:10.2307/j.ctt1qft06g.
Lapowsky, I. (2015). New Facebook rules show how hard it is to police 1.4B users. Wired. Retrieved from https://www.
wired.com/2015/03/facebook-guidelines
Latane, B., & Darley, J. (1969). Bystander “apathy.” American Scientist, 57, 244–268. doi:10.2307/27828530.
Lieberman, M. (2017, November 8). It’s all in the data. Inside Higher Ed. Retrieved from https://www.
insidehighered.com/digital-learning/article/2017/11/08/university-system-maryland-standardizes-data-collection-
improve
Mah, L., Szabuniewicz, C., & Fiocco, A. J. (2016). Can anxiety damage the brain? Current Opinion in Psychiatry,
29(1), 56–63. doi:10.1097/yco.0000000000000223.
Marshall, T. C., Lefringhausen, K., & Ferenczi, N. (2015). The Big Five, self-esteem, and narcissism as predictors of
the topics people write about in Facebook status updates. Personality and Individual Differences, 85, 35–40.
doi:10.1016/j.paid.2015.04.039.
McCain, J., & Campbell, W. K. (2017, November 10). Narcissism and social media use: A meta-analytic review.
Psychology of Popular Media Culture. doi:10.1037/ppm0000137.
Moor, J. (2008) Why we need better ethics for emerging technologies. In J. van den Hoven & J. Weckert (Eds.), Infor-
mation technology and moral philosophy (pp. 26–39). Cambridge, UK: Cambridge University Press. doi:10.1017/
CBO9780511498725.003.
Murphy, B. J. (2016, November 25). Lip reading skills by Google’s AI is on the fleek. Serious Wonder. Retrieved from
http://www.seriouswonder.com/lip-reading-skills-google-ai-on-fleek
Neal, R. W. (2014, March 18). Google sued for data-mining: California students claim violation of educational privacy.
IB Times. Retrieved from http://www.ibtimes.com/google-sued-data-mining-california-students-claim-violation-
educational-privacy-1562198
Pearse, D. (2012, March 17). Facebook’s “dark side”: Study finds link to socially aggressive narcissism. The Guardian.
Retrieved from https://www.theguardian.com/technology/2012/mar/17/facebook-dark-side-study-aggressive-
narcissism
Poell, T. (2015). Social media activism and state censorship. In D. Trottier & C. Fuchs (Eds.), Social media, politics
and the state: Protests, revolutions, riots, crime and policy in the age of Facebook, Twitter and YouTube (pp. 189–206).
New York, NY: Routledge. doi:10.4324/9781315764832-18.

Polonski, V. (2017, August 9). How artificial intelligence silently took over democracy. World Economic Forum.
Retrieved from https://www.weforum.org/agenda/2017/08/artificial-intelligence-can-save-democracy-unless-it-
destroys-it-first
Pressler, J. (2017, September 20). This Stanford professor has a theory on why 2017 is filled with jerks. New York
Magazine. http://nymag.com/daily/intelligencer/2017/09/robert-sutton-asshole-survival-guide.html
Price, M. (2017, September 21). U.S. SEC says hackers may have traded using stolen insider information. Reuters.
Retrieved from https://www.reuters.com/article/legal-us-sec-intrusion/u-s-sec-says-hackers-may-have-traded-
using-stolen-insider-information-idUSKCN1BW1K0
Quick, A. (2016, December 8). Fake news is the new normal we must defend. The Southern Illinoisan. Retrieved
from http://thesouthern.com/news/opinion/editorial/quick/quick-fake-news-is-the-new-normal-we-must-defend/
article_f161af7f-08e8-569c-a770-2303923de076.html
Riglian, A. (2012). “Big data” collection efforts spark an information ethics debate. Tech Target. Retrieved
from http://searchcloudapplications.techtarget.com/feature/Big-data-collection-efforts-spark-an-information-
ethics-debate
Roll, N. (2017, June 19). New salvo against Turnitin. Inside Higher Ed. Retrieved from https://www.insidehighered.
com/news/2017/06/19/anti-turnitin-manifesto-calls-resistance-some-technology-digital-age
Scassa, T. (2017). Law enforcement in the age of big data and surveillance intermediaries: Transparency challenges.
SCRIPT-ed, 14(2), 239–284. doi:10.2966/scrip.140217.239.
Schmidt, E., & Cohen, J. (2013). The new digital age: Reshaping the future of people, nations and business. New York,
NY: Knopf.
Seetharaman, D., & Morris, B. (2017, August 13). Facebook’s Onavo gives social-media firm inside peek at rivals’
users. Wall Street Journal. Retrieved from https://www.wsj.com/articles/facebooks-onavo-gives-social-media-
firm-inside-peek-at-rivals-users-1502622003
Siddique, H. (2017, July 5). Sex robots promise “revolutionary” service but also risks, says study. The Guardian.
Retrieved from https://www.theguardian.com/technology/2017/jul/05/sex-robots-promise-revolutionary-service-
but-also-risks-says-study
Simonite, T. (2017, July 7). Two giants of AI team up to head off the robot apocalypse. Wired. Retrieved from https://
www.wired.com/story/two-giants-of-ai-team-up-to-head-off-the-robot-apocalypse
Sorrel, C. (2016, March 15). What happens when we become a cashless society. Fast Company. Retrieved from https://
www.fastcompany.com/3056736/what-happens-when-we-become-a-cashless-society
Stanley, J. (2017, September 14). Apple’s use of face recognition in the new iPhone: Implications. ACLU.
Retrieved from aclu.org/blog/privacy-technology/surveillance-technologies/apples-use-face-recognition-
new-iphone
Sulleyman, A. (2017, August 14). Facebook knows what millions of people do on their phones, even if they don’t actually
use the social network. The Independent. Retrieved from http://www.independent.co.uk/life-style/gadgets-and-tech/
news/facebook-know-smartphones-activity-what-do-not-use-social-network-account-media-privacy-security-a7892761.
html
Taylor, K. (2017, July 4). Why “cashless societies” don’t benefit the poor. World Economic Forum. Retrieved from
https://www.weforum.org/agenda/2017/07/why-cashless-societies-dont-benefit-the-poor
Thomson, S. (2015, September 15). 13 signs the fourth industrial revolution is almost here. World Economic Forum.
Retrieved from https://www.weforum.org/agenda/2015/09/13-signs-the-fourth-industrial-revolution-is-almost-here
Twenge, J. (2013, September 24). Social media is a narcissism enabler. New York Times. Retrieved from https://www.
nytimes.com/roomfordebate/2013/09/23/facebook-and-narcissism/social-media-is-a-narcissism-enabler
U.S. Securities and Exchange Commission. (2015, October 9). Form S-1, Instructure, Inc. Retrieved from https://www.sec.
gov/Archives/edgar/data/1355754/000119312515341090/d932934ds1.htm
Wakefield, J. (2015, September 14). Intelligent machines: The jobs robots will steal first. BBC News. Retrieved from
http://www.bbc.com/news/technology-33327659
Walker, R. (2015). From big data to big profits: Success with data and analytics. Oxford, UK: Oxford University Press.
doi:10.1093/acprof:oso/9780199378326.001.0001.
Weiser, E. B. (2015). #Me: Narcissism and its facets as predictors of selfie-posting frequency. Personality and Individ-
ual Differences, 86, 477–481. doi:10.1016/j.paid.2015.07.007.
Wessing, T. (2014, July). Regulation of big data in the United States. Global Data Hub. Retrieved from https://united-
kingdom.taylorwessing.com/globaldatahub/article_big_data_us_regs.html

Wiener, N. (1954). The human use of human beings: Cybernetics and society. New York, NY: Doubleday Anchor.
Wiener, N. (1964). God & Golem, Inc.: A comment on certain points where cybernetics impinges on religion.
Cambridge, MA: MIT Press.
World Economic Forum. (2016, August 10). Here’s how your career can survive automation. Retrieved from https://
www.weforum.org/agenda/2016/08/how-you-can-ride-the-wave-of-workplace-change
Young, J. R. (2008). Judge rules plagiarism-detection tool falls under “fair use.” Chronicle of Higher Education, 54(30),
A13. Retrieved from https://www.chronicle.com/article/Judge-Rules/19218
Zimmerman, T. A. (2008). McLean students file suit against Turnitin.com: Useful tool or instrument of tyranny?
Conference on College Composition & Communication. Retrieved from http://cccc.ncte.org/cccc/committees/ip/
2007developments/mclean