
2021 3rd World Symposium on Artificial Intelligence (WSAI)

Ethics and Facial Recognition Technology: An Integrative Review

Aimee Kendall Roundtree
Technical Communication Program
Texas State University
San Marcos, TX, U.S.
akr@txstate.edu

978-1-6654-0451-8/21/$31.00 ©2021 IEEE | DOI: 10.1109/WSAI51899.2021.9486382

Abstract—This integrative review will synthesize current studies and research on the ethics of facial recognition technology to raise awareness and reflection among engineering practitioners, teachers, and students. Engineers as developers of this software—and the professional communicators who help test and translate it for the public—could use ethical insights to discuss and implement user-centered design leading up to and after deployment more meaningfully. The integrative review will focus on academic research and studies conducted in the past ten years about ethical issues in facial recognition to synthesize findings and insights toward a framework for guidelines and best practices.

Keywords—facial recognition, ethics, integrative review

I. INTRODUCTION

This integrative review will synthesize current studies and research on facial recognition technology (FRT) ethics to raise awareness and reflection among engineering practitioners, teachers, and students.

The global facial recognition market is projected to grow from $3.2 billion in 2019 to $7.0 billion by 2024, a compound annual growth rate of 16.6% [1]. The growth causes some alarm, mainly because of FRT inaccuracies that could widen social stigmas and disparities. For example, facial recognition algorithms are inaccurate in identifying some communities of color. The National Institute of Standards and Technology (NIST) evaluated 189 facial recognition algorithms from 99 developers, representing most of the industry [2]. NIST examined how well they perform "one-to-one" matching (used for verification tasks such as unlocking a smartphone or checking a passport) and "one-to-many" matching (used for identification of a person of interest). In addition, they tracked false positives and false negatives. A false positive means that the software wrongly considered photos of two different individuals to show the same person, while a false negative means the software failed to match two photos that do show the same person. Using a sample of 18.27 million images of 8.49 million people from operational databases provided by the State Department, the Department of Homeland Security, and the FBI, NIST found higher rates of false positives in one-to-one matching for Asian and African American faces relative to images of Caucasians.

Legal questions and uncertainties also complicate the implementation of facial recognition. At a recent Congressional hearing, the American Civil Liberties Union discovered that the FBI's massive facial recognition apparatus continues to expand and can now match against over 640 million photos [3]-[5]. The hearing also uncovered that the FBI claims it can use face recognition on individuals without a warrant or probable cause. The Bureau does not track basic statistics to measure the technology's efficacy, nor can it confirm that it complies with all constitutional obligations.

An integrative review is helpful because it synthesizes the latest insights to recommend guidelines and best practices. For example, engineers as developers of this software—and the professional communicators who help test and translate software for the public—could use ethical insights to more meaningfully discuss and implement user-centered design leading up to and after deployment. The integrative review will focus on academic research and studies conducted in the past ten years about ethical issues in facial recognition to synthesize findings and insights toward a framework for guidelines and practices.

II. METHODS

Study characteristics included peer-reviewed articles and excluded theses, primarily because they are not peer-reviewed. The studies were published in English between 2010 and 2020. Searching ACM, IEEE, EBSCO, Scopus, Web of Science, PubMed, ERIC, ScienceDirect, JSTOR, and Google Scholar enabled finding articles and identifying sources. However, library holdings did not allow for accessing the full-text versions of a few articles. Search terms included facial recognition, software, technology, morals, and ethics. Screening involved reading database previews and article abstracts for eligibility. Once eligible studies were screened, reading the content of the entire article revealed the extent to which they identified ethical problems and dynamics. We eliminated articles that did not mention ethical issues, theories, or dynamics more than twice. We compiled full-text articles in spreadsheets and gathered paragraphs and sentences that developed arguments or main points about ethical dynamics. The analysis also entailed close readings of each included article.

Tropes and LIWC software helped code data, sentence by sentence, using an inductive coding analysis to handle and interpret data and to combine the results of studies [6]-[9]. Tropes software analyzes written or spoken texts by reducing
the text to essential linguistic components and matches to validated dictionaries of text types. A close reading of content confirmed categories. LIWC includes text analysis modules and built-in dictionaries. LIWC reads written or transcribed verbal texts, compares each word in the text against a validated dictionary, and identifies which words are associated with psychological and sentiment categories. The overall sentiment was determined by subtracting negative from positive sentiment scores.

The meta-analysis was conducted using the software to categorize sentence topics and count frequencies of recurring themes. To find pertinent sentences, we searched for keywords, gathered sentences in a spreadsheet, and analyzed them using qualitative close reading and quantitative data analytics software. Both allowed for sentence-level topic categorization and sentiment analysis to derive a frequency count of sentences with different thematic foci. Quantitative analyses confirmed qualitative close reading, and vice versa, using iterative checking. Software counted frequencies.

Topics and sentiments were identified using Tropes and LIWC and verified by close reading. Tropes used text mining and natural language processing to analyze and categorize sentences. Tropes detects content categories, contexts, themes, and principal actors by applying three levels of semantic classification. Each content category was confirmed by close reading by the coder. Next, full texts of the categories were analyzed using LIWC for sentiment analysis. LIWC reads text and counts the percentage of words that reflect different emotions, thinking styles, social concerns, and parts of speech. LIWC was used to determine the sentiment of content comprising categories. LIWC enlists its preset library validated in social, clinical, health, and cognitive psychology research. The results section reports average sentiment scores of themes and codes.

A coder marked which school of philosophy each sentence represented per categories classified using text mining software to determine philosophical leanings. The same coder internally validated the codes on a second round of coding completed four weeks after the first round.

III. RESULTS

A. Articles Included

TABLE I. INCLUDED ARTICLES & CONCLUSION PHILOSOPHIES

ref # | coi/ack | fwci | cited | ethical philosophy of conclusions
10 | 1 | 1.55 | 42 | deontological, situational, intuitionism
11 | 0 | 0 | 1 | consequentialism, deontological
12 | 1 | 5.22 | 1 | deontological, situational, subjectivism
13 | 0 | 0.92 | 55 | consequentialism, situational, intuitionism
14 | 0 | 0 | 1 | consequentialism, deontological, situational
15 | 0 | 0 | 4 | consequentialism, deontological
16 | 0 | 1.45 | 33 | situational
17 | 0 | 0.59 | 11 | consequentialism, deontological, emotivism
18 | 1 | 0 | 0 | consequentialism, deontological, subjectivism
19 | 0 | 0 | 0 | consequentialism, subjectivism, virtue
20 | 1 | 2.14 | 6 | consequentialism, intuitionism
21 | 0 | 0 | 0 | deontological, situational, subjectivism
22 | 1 | 0 | 1 | consequentialism, intuitionism
23 | 0 | 0.66 | 3 | consequentialism, deontological, situational
24 | 0 | 0.71 | 83 | consequentialism, intuitionism
25 | 1 | 0 | 1 | consequentialism, virtue, intuitionism
26 | 1 | 0 | 2 | deontological, situational
27 | 1 | 0.91 | 4 | consequentialism, situational
28 | 0 | 0 | 3 | situational, subjectivism
29 | 1 | 0 | 17 | deontological, situational, intuitionism
30 | 1 | 12.92 | 1 | consequentialism, subjectivism, emotivism
31 | 1 | 0 | 1 | consequentialism, deontological, subjectivism
32 | 0 | 1.06 | 17 | consequentialism, deontological, intuitionism
33 | 1 | 0 | 0 | consequentialism, deontological, situational, subjectivism
34 | 1 | 0 | 0 | consequentialism, deontological, virtue
35 | 1 | 0 | 2 | consequentialism, deontological, situational
36 | 1 | 1.12 | 3 | consequentialism, deontological
37 | 0 | 23.59 | 36 | consequentialism, deontological, virtue
38 | 0 | 25.55 | 147 | consequentialism, deontological, intuitionism
39 | 1 | 0.17 | 4 | consequentialism, deontological, emotivism
40 | 1 | 2.71 | 21 | consequentialism, deontological, subjectivism
41 | 1 | 0 | 1 | consequentialism, deontological, virtue
42 | 0 | 0.22 | 10 | deontological, subjectivism
43 | 0 | 0.26 | 4 | consequentialism, deontological, situational
44 | 0 | 3.96 | 49 | subjectivism, virtue, emotivism
45 | 0 | 1.64 | 8 | deontological, situational
46 | 1 | 59 | 18 | consequentialism, deontological
47 | 1 | 5.71 | 71 | consequentialism, intuitionism, emotivism
48 | 1 | 0 | 1 | deontological, subjectivism
49 | 1 | 0 | 0 | consequentialism
50 | 0 | 0 | 0 | deontological, subjectivism
51 | 0 | 5.04 | 21 | consequentialism, situational
52 | 0 | 0 | 0 | consequentialism, virtue
53 | 0 | 0 | 5 | consequentialism, subjectivism
54 | 0 | 1.03 | 15 | consequentialism, intuitionism
55 | 0 | 0 | 0 | consequentialism, situational
56 | 0 | 0 | 0 | deontological
57 | 0 | 1.21 | 44 | consequentialism, deontological
58 | 1 | 0 | 1 | consequentialism, intuitionism
59 | 0 | 8.36 | 30 | consequentialism, situational
60 | 0 | 0 | 0 | virtue, intuitionism
61 | 0 | 0 | 0 | consequentialism, deontological, virtue
62 | 1 | 0 | 1 | consequentialism, intuitionism
63 | 0 | 0.2 | 5 | consequentialism, intuitionism

The general search yielded 157 articles that mentioned the keywords in their titles, keywords, or abstracts. In the second step, a survey of abstracts eliminated articles that were repeats (n=26), not in English (n=8), inaccessible (n=14), or off focus (n=55). Overall, 54 articles and chapters were included in this review. Under half (n=23) reported conflicts of interest (COI) or acknowledgments of funding
(ACK). Most earned a low field-weighted citation impact (FWCI), the ratio of total citations received to total citations expected. Eight had an FWCI of 5 or higher. Eighteen were cited ten or more times.

B. Philosophy and Sentiment

Sentences from conclusions and full text were evaluated for their philosophical positioning and their positive versus negative sentiment. Table II shows the philosophical positioning and sentiment of conclusions drawn. Table III shows the philosophical positioning and sentiment of full-text content and themes that emerged within philosophy categories.

TABLE II. PHILOSOPHIES OF CONCLUSION STATEMENTS

philosophy | n | + | - | sample
Consequentialism | 1262 | 2.22 | 1.22 | "It is essential to weigh the relative benefits and burdens of specific FRT uses in health care and conduct research into how patients perceive its use" [27, p. 3].
Situational | 949 | 1.79 | 0.86 | "Even when data in a particular data set are genuinely anonymous, there may still be issues when that dataset is combined with other datasets" [28, p. 242].
Deontological | 621 | 1.98 | 0.93 | "The government has only recently begun to focus on this new privacy risk and consider the possibility of laws and regulations to protect consumers from inappropriate use of facial recognition technologies by the digital advertising industry" [14, p. 247].
Virtue | 565 | 1.97 | 1.12 | "Businesses committed to ethics and integrity are concerned with finding ways to implement ethically appropriate standards and practices as part of their business activity" [10, p. 882].
Intuitionism | 409 | 1.75 | 0.94 | "In a real sense, it looks like a black box, where no knowledge about the process is happening in between, which results in hindrance in the acceptance of biometric technology, especially in the United States of America" [29, p. 3].
Subjectivism | 336 | 1.65 | 0.93 | "Biometrics systems can only deal with people who fall within the range defined as 'normal' by the individual system's commissioners, designers, and administrators" [19, p. 49].
Emotivism | 224 | 2.46 | 1.75 | "[O]ur human distinctiveness is a source of wonder" [30, p. 4].

Consequentialism evaluated facial recognition technology based on outcomes expected from its application. Deontological statements called for setting up rules to govern its ethical application. Situational statements called attention to the circumstances around FRT applications. Subjectivism underscored how the ethics of facial recognition varied for different individuals or organizations. Virtue statements called upon the integrity of organizations or individuals rather than the value of the application itself to estimate the ethics of facial recognition. Intuitionism relied on the accuracy of the algorithm to determine the ethics of facial recognition. Finally, emotivism estimated the ethics of facial recognition based on feelings toward humans or technology.

Most conclusion statements assumed a consequentialist approach (n=1262) that evaluated the technology based on its potential outcomes. In addition, many statements took a situational approach (n=949, where circumstances dictated appropriateness) or a deontological approach (n=621, underscoring rules and guidelines). Virtue (n=565, where the characters of the designers or organizations using the technology prevail) and intuitionism (n=409, where the integrity of the software or algorithm determines the ethics of FRT) were also prevalent. There were also subjective statements (n=336, which underscored the variety of personal or individual opinions on the ethics of facial recognition) and emotive statements (n=224, where ethical judgments and decisions were expressed in terms of feelings and emotions). Subjectivism and emotivism were less positive overall than consequentialism and deontological statements.

Subtracting negative from positive scores, statements about behavior, characteristics, and security were generally more positive than discussions of social groups, laws, and bodies. Points made about feelings and fighting were both more negative than positive overall.

TABLE III. FULL-TEXT THEMES, SENTIMENT & PHILOSOPHIES

code | # | + | - | philosophy
communication | 3907 | 1.76 | 0.84 | subjectivism
computer science | 2801 | 1.36 | 0.62 | intuitionism
education | 1933 | 1.90 | 0.78 | virtue
business | 1842 | 1.79 | 0.73 | consequential
technology | 1782 | 1.97 | 1.19 | intuitionism
time | 1698 | 1.65 | 0.93 | situational
law | 1566 | 1.81 | 1.44 | deontological
social group | 1430 | 1.69 | 1.31 | situational
science | 1403 | 1.65 | 0.92 | deontological
security | 1264 | 4.00 | 2.62 | consequential
media | 1150 | 1.41 | 0.70 | situational
system | 1057 | 2.08 | 1.15 | deontological
device | 1045 | 1.81 | 0.77 | intuitionism
body | 1039 | 1.20 | 0.83 | subjective
health | 991 | 2.26 | 1.33 | situational
person | 990 | 1.83 | 1.23 | virtue
language | 953 | 1.86 | 0.79 | deontological
fight | 950 | 1.90 | 2.26 | subjective
behavior | 932 | 3.12 | 1.37 | emotivism
control | 851 | 1.81 | 0.91 | deontological
location | 765 | 1.77 | 0.96 | situational
characteristic | 738 | 2.79 | 1.23 | virtue
feeling | 694 | 3.08 | 3.12 | emotivism
politics | 667 | 1.63 | 0.92 | consequential
cognition | 610 | 1.98 | 1.00 | virtue
organization | 545 | 1.80 | 0.82 | virtue
automation | 530 | 2.29 | 1.46 | intuitionism
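The net-sentiment comparison above follows the rule stated in the Methods section: overall sentiment is the positive LIWC score minus the negative score. A minimal sketch of that arithmetic over a few of the Table III theme scores (the function and variable names are ours for illustration, not LIWC's):

```python
# Net sentiment per theme: positive minus negative LIWC category scores.
# The (positive, negative) pairs below are taken from Table III.
theme_scores = {
    "security": (4.00, 2.62),
    "behavior": (3.12, 1.37),
    "law": (1.81, 1.44),
    "fight": (1.90, 2.26),
    "feeling": (3.08, 3.12),
}

def net_sentiment(pos: float, neg: float) -> float:
    """Overall sentiment = positive score minus negative score."""
    return round(pos - neg, 2)

nets = {theme: net_sentiment(p, n) for theme, (p, n) in theme_scores.items()}

# Themes whose net sentiment falls below zero skew negative overall;
# per Table III, only "fight" and "feeling" do.
negative_themes = sorted(t for t, s in nets.items() if s < 0)
```

Running this reproduces the pattern reported in the text: behavior and security come out net positive, while feelings and fighting come out net negative.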
C. Themes Within Philosophies

Table III illustrates the frequency of the most common and pertinent themes that emerged in the full text of the articles' content, with corresponding philosophical categories and positive or negative sentiment.

Consequentialism emerged in questions about the business outcomes of facial recognition. The articles were more likely to make the case from a business perspective (n=1842) than from a social, behavioral, or legal perspective. Business justifications included discussing facial recognition as a benefit or essential part of business activities and practices: "This technology is currently being employed by many different businesses and branches of the government in an attempt to improve security" [11, p. 2]. Business activities served as a factor for ethical consideration and a consequence to weigh for determining action. Money and financial consequences (n=441) also factored prominently in making a case for clarifying FRT ethics. The economic implications of facial recognition mattered: "In the case of unethical behavior, an organization must fully assume financial costs, but also image related ones that could lead to immense market share losses, reducing turnover, diminishing the whole company's credibility among the target client groups" [36, p. 10]. The upside of the argument promises diminishing customer pain points; the downside, loss of credibility and consumer trust.

The security (n=1264) of facial recognition systems was another important consequence considered. Questions lingered about whether the systems protected the public or took advantage of them and their data. On the one hand, facial recognition was proposed as more secure than passwords. On the other hand, hacking such facial recognition software would compromise public security: "[I]f your facial data is stolen [it] is not something you can change easily, making all future facial recognition based permission systems…subject to external risk" [21, p. 87]. Furthermore, the digital and personal nature of the data would render it vulnerable to hacking, and data "may be misused, lost, or stolen, leading to potential unauthorized matching, tracking, impersonation, and other deceptive practices" [16, p. 56].

The literature promised increased security as a potential benefit, such as the "alleged value the technology has in stopping crime, and arresting criminals like murderers, drug traffickers and sexual offenders…[and] the added feeling of security that the technology may bring" [13, p. 102]. However, others underscored the risks to the public and their privacy and security: "Facial recognition technology will definitely benefit advertisers; however, it poses complex privacy and security concerns" [61, p. 247]. In particular, the software's poor accuracy record in vulnerable populations posed more risk and less security for those communities: "Future research should also consider examining other potential risks of using facial recognition technology, such as stereotyping effects (i.e., racism, sexism, and ageism)" [61, p. 249].

Treatment of the body (n=1039) and body parts also emerged. Facial recognition renders physical characteristics as data: "The most current data suggests that eyebrows are more important than the eyes" [42, p. 3]. Some literature compared how facial recognition technology processed the body to other forms of bio-identification: "In contrast to other biometric forms of identification such as digitized fingerprints and iris scans, facial recognition systems are inexpensive, unobtrusive, and can operate anonymously in the background without the active consent of those under surveillance" [20, p. 251]. Using body data without consent compromises privacy and ownership. Still, some findings suggested that the public feels more comfortable with FRT than other biometrics: "[E]ye tracking, biometric measurements, IAT, and facial recognition seem to generate fewer ethical issues" [36, p. 17].

Furthermore, issues of control (n=851) were another significant consequence. The proprietary nature of most facial recognition software (industry control of the algorithms) cast suspicion on the software's trustworthiness: "For commercially available recognition systems, those who are invisible may simply become cases of uncorrected error" [20, p. 252]. Who controls the data mattered: "Amazon was selected following the revelation of the active use and promotion of its facial recognition technology in law enforcement" [38, p. 431].

Finally, the articles described how facial recognition might enable bad behavior and bad faith in politics (n=667) in governments using FRT. Bad behavior includes false incarcerations and dehumanization of citizens. For example: "Stochastic governance is made possible by Big Data…Stochastic governance even subjects human bodies to its logic" [44, p. 4-5].

Situational critique emerged when articles weighed how the technology would impact different countries and settings. Different locations (n=765) had different values and laws, which influenced attitudes about the ethics of the technology. Time (n=1698) impacted the conditions of trustworthiness and ethics of the technology. The articles discussed how changes over time and aggregated data over time compromise the integrity, accuracy, and precision of the technology: "The process usually takes account of the likely object dynamics and expected changes (or constancy) in shape and appearance over time" [10, p. 274].

Articles touted the promise of facial recognition to help keep schools safe. For example, "Many parents would most likely feel safer knowing their children's elementary school had a facial recognition system to ensure that convicted child molesters were not granted access to school grounds" [13, p. 102]. The studies also discussed applications of the technology for monitoring student activity: "[S]everal schools in the United Kingdom have started already using facial recognition systems to monitor students' attendance and for timekeeping tasks" [61, p. 243].

How the media reports the ethics of facial recognition was also a major theme (n=1150). Facial recognition deployment at protests raised ethical concerns: "The recent media coverage of police use of facial recognition on protesters in Hong Kong has sparked fears of persecution" [26, p. 302]. The literature also described how the media inflates the touted value and virtues of facial recognition software: "The ethics of the claims regarding the performance and effectiveness of facial recognition systems by the media, and those companies selling the systems, merits some consideration" [11, p. 7].
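The theme tallies cited throughout these results (business n=1842, security n=1264, and so on) are sentence-level frequency counts of the kind the Methods section describes: software counted how many sentences touched each theme. A toy sketch of such a tally follows; the keyword lists and sample sentences are invented for illustration and do not reflect the study's actual dictionaries or corpus:

```python
from collections import Counter

# Hypothetical theme keyword lists (illustrative only, not the study's dictionaries).
THEMES = {
    "business": {"business", "market", "company"},
    "security": {"security", "hacking", "breach"},
    "law": {"law", "regulation", "warrant"},
}

def count_theme_sentences(sentences):
    """Tally how many sentences mention each theme at least once."""
    counts = Counter()
    for sentence in sentences:
        words = set(sentence.lower().replace(".", "").split())
        for theme, keywords in THEMES.items():
            # A sentence counts toward a theme if it shares any keyword with it;
            # one sentence can count toward several themes.
            if words & keywords:
                counts[theme] += 1
    return counts

sample = [
    "The company weighed market share losses.",
    "Hacking the software would compromise public security.",
    "Regulation may require a warrant.",
    "Business uses raise security concerns.",
]
counts = count_theme_sentences(sample)
```

On the four sample sentences this yields business: 2, security: 2, law: 1, the same per-theme sentence counting (at a far larger scale, with validated dictionaries) that produced the n values reported here.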
Facial recognition also factored prominently in health (n=991). For example, articles proposed using it "for managing a patient's adherence to medication…or to gauge the patient's pain level" [22, p. 1]. Another proposed using it for "prediction as to who will benefit…the most in the long run or who is most likely to have certain diseases" [22, p. 4].

However, the literature focused more on public than clinical deployments of facial recognition technology. "As a case study, the Faceit facial recognition engine of Identix Corporation will be analyzed, as well as its use in Smart video surveillance (CCTV) systems in city centers and airports" [61, p. 97]. It also discussed some personal and voluntary deployment for quality of life, such as dating services: "In addition, a dating service named Findyourfacemate.com uses a sophisticated facial recognition tool to match potential partners based on the psychological theory that people with similar facial features are attracted to each other" [61, p. 244].

Differences between cultures and countries mattered in how the ethics of FRT was perceived: "A startup signed a deal with the Zimbabwe government to harvest the faces of millions of citizens through unprecedented access" [37, p. 148]. FRT is acceptable in some countries or cultures more so than in others: "[O]rganizations in Romania are more familiarized with traditional research techniques than with those that are used in neuromarketing" [36, p. 139]. The literature called for more research and investigation into public perception of facial recognition, given the cultural differences and ethical risks: "[F]uture studies should attempt to identify consumers both positive and negative perceptions…[and] investigate how to protect vulnerable market segments, such as children and teenagers, from highly sophisticated facial recognition advertising" [61, p. 249].

Regarding social groups (n=1430), questions lingered about use in public places: "[T]he system is operated by the city police department…for routine surveillance, meaning that people in the area will be routinely scanned and have their faces searched in a database" [13, p. 100].

Ultimately, media coverage (n=1150) reflects the differences and variations of attitudes around the world about FRT: "[In the] wider media debate…proponents typically argue that the technology has significant security benefits and minimal privacy losses...Opponents typically argue that the security benefits are overestimated, and the privacy losses are underestimated" [13, p. 101].

Deontological concerns emerged about general and industry laws, policies, and rules that might help improve the ethical integrity of the technology. Although legal considerations are not always the same as ethical ones, the literature framed ethical concerns about facial recognition in terms of the law (n=1566). Facial recognition was contemplated for law enforcement in terms of decreasing violence and increasing security: "Police forces and national security agencies in the U.S., the United Kingdom, Singapore, South Korea and elsewhere are experimenting with facial recognition to combat violent crime and tighten border security" [44, p. 94]. The articles suggested that governments pass legislation for facial recognition: "Based on the findings of future research policy-makers will be able to understand potential harms of using facial recognition advertising before setting up guidelines or regulations" [61, p. 249]. Industries themselves are also working out self-regulation and guidelines. Industry-established privacy guidelines should "include the rules that 'Clear signage has been posted throughout the area'...[and] that 'Smart CCTV' is in use" [13, p. 103]. Industries should also disclose how they gather and use databases of images: "The images in the database are those of known offenders; non-matching images are discarded from the system once the comparison has been conducted" [13, p. 103].

The rigor and procedural logic of the scientific method and research methodology (n=1403) also helped the articles demonstrate the problems with the technology and propose further inquiries to help improve it: "It would be best to understand this through some detailed study of the logic and operation of these algorithms in diverse settings with diverse databases" [24, p. 80].

Articles noted that the ethics of FRT hinged on the rules of the social and organizational systems (n=1057) that use them. Even if the programming is ethical, the rules and practices of society might still lead to unethical deployments: "It is therefore important to consider the whole system when analyzing a facial recognition system" [14, p. 45].

Language (n=953) also revealed or obscured rules of engagement with facial recognition technology. First, articles stressed the importance of making the terms and conditions of data storage, and the use and limitations of the technology, transparent and clear to users: "We suffer from consent fatigue, which is an overload of information on terms and conditions we are not able to sort through anymore" [33, p. 121]. Second, if the syntax and subroutines of facial recognition programs themselves are black boxes inaccessible to all stakeholders, ethical problems arise. It "is impossible to inspect that code in operation, as it becomes implemented through multiple layers of translation for its execution" [24, p. 77].

Virtues of programmers and the organizations where they worked were also called into question. Facial recognition was organizational but also personal (n=990). Facial recognition might accurately detect personal characteristics and interpret "a person's emotions based on his facial movements" [36, p. 16]. However, such predictions are superficial or based on affect. Facial recognition and other biometric technologies reduce people to biological characteristics such as "individual features and continuous analysis of unique body dynamics" [32, p.
262]. In addition, studies stressed that FRT must account for subtle differences in individually and collectively held values and morals: "There may be cases where multiple users need to set multiple preferences" [46, p. 584].

They also pointed out that the individual biometrics and characteristics (n=738) that differentiate one face from another—the differences that FRT exploits—constitute personal identity. Therefore, they should be highly regarded and protected in ethical deployments of FRT: "Data management is a sensitive topic among research managers because a possible 'leak' of information (confidential data regarding socio-demographic characteristics of participants, answers, attitudes, behaviors, etc.) could ruin companies' image and credibility among current and/or prospective customers" [36, p. 134].

Different morals of the organizations (n=545) involved might hinder consensus. Each has different priorities for FRT ethics. They included civil rights groups, businesses, governmental branches, advertisers, and professional organizations representing these stakeholders, among others [11, 13, 20, 38, 57, 61]. However, the stakeholders discussed were overwhelmingly from North America (n=1219) and Europe (n=916) more so than from Asia (n=289), Africa (n=62), and developing countries or emerging economies therein.

Still, there was skepticism about the capacity of industry ethical standards to overcome industry self-interest: "Commercial facial recognition algorithms may further entrench unregulated corporate actors in the operations of the criminal justice system" [20, p. 253]. There was worry about governmental invasion of privacy and overreach in gathering public data: "Next there is the question of government invasion of privacy through the use of facial recognition technology. Current legal doctrine…holds that there is little or no expectation of privacy in public…This view does not account for the advancement of technology and the implications it portends for privacy" [11, p. 7]. Questions also lingered about the security of facial recognition in law and order.

Intuitionism, for its part, hinged on the capacity of the algorithm to predict and intuit insights. Some literature suggested that FRT could achieve high accuracy: "Facial recognition is touted as the 'holy grail' of digital surveillance (Introna & Wood, 2004)" [20, p. 251]. Other literature suggested that the complexity and variability of bodily information prevented true FRT accuracy: "Current facial recognition technology has not fully matured enough to perfectly identify an individual by analyzing his facial information" [61, p. 243]. Changes in features pose difficulties for accuracy: "[The] machines were not adequately equipped to deal with simple––from a human perspective––changes in appearance" [57, p. 49].

Other aspects of computer science (n=2801) limited FRT's capacity to intuit accurate insights. For example, articles criticized technology based on homogeneous images because they lead to false positives and negatives for excluded populations: "[M]ost of these databases include a limited number of subjects with many replications under niche conditions and no standard baseline control images" [12, p. 3].

Articles described techniques (n=1782) for increasing FRT trustworthiness, such as masking: "A technique to automatically perform such blurring or masking of faces" [10, p. 280]. Still, others ascribed ethical agency to FRT devices (n=993). Sometimes they found devices virtuous: "[T]he rare disease world now has a powerful tool that…may reap significant benefits" [30, p. 659]. Sometimes they found them dubious: "[W]e must also consider the ethics of these devices in a shifting technological/surveillance culture" [23, p. 167].

In many cases, articles described how FRT enables automation (n=530) in robots, which bestows them with some degree of agency and potential for ethical behavior, actions, decision making, and abuse: "[O]nce therapeutic robots leave the laboratory and clinical setting, they will no longer be regulated for ethical oversight" [46, p. 577].

Subjectivism also emerged in articles about FRT ethics. Many underscored how very different individual perspectives, values, and morals comprised the conversation.
example, the prospect of hacking, photo-editing, and fakes. "If Communication (n=3907) called for open debates involving
Deep Fake or misleading algorithms have found their way into diverse opinions about the matter: “This debate must consider
the A.I. used by the police, then it is an easy matter for the all the potential impacts of this technology and must be
software to mislead the police to believe that an innocent conducted in an open manner" [14, p. 45]. In addition, it
bystander is a criminal" [22, p. 7]. included calls for disclosing terms of use and requesting
Education (n=1933) was recommended for improving the consent from all stakeholders: "[I]ndividuals included in these
virtue of all stakeholders involved. The literature called for face databases have not provided consent for that inclusion”
more training to improve FRT algorithms: "[S]cholarship on [12, p. 13]. But communication also revealed the process of
the impact of algorithmic audits on increasing algorithmic translating and turning faces and other biometrics into
fairness and transparency in commercial systems is nascent" information: “[O]ther biometric representations of your body a
[38, p. 429]. Articles recommended training on code of ethics reflection of your identity, are personal information” [16, p.
and conduct for development teams, for example, developing 57].
"educational material on fairness considerations for their The literature framed the ethical issues as a debate or fight
developer or enterprise clients" [38, p. 433]. The articles (n=950) between dichotomous positions. On the one hand,
stressed that every stakeholder should have cognition (n=594) facial recognition fought crime and threats to public safety:
and understanding of the ethical issues: "Consequently, there "The Department of Homeland Security has spent millions of
needs to be a greater awareness and concern for how such dollars on cameras with facial recognition capabilities in an
content is managed and handled” [18, p. 4]. attempt to identify potential threats to the American people"
Intuitionism emerged as a feature of FRT algorithms [11, p. 2]. On the other, it harmed personal privacy: "The threat
themselves. According to the articles, whether FRT is ethical to posed to privacy by facial recognition technology far
some extent depends on the algorithm’s accuracy and ability to outweighs any possible benefits of the technology" [11, p. 8].
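The blurring and masking mentioned above as a trust-building technique [10] can be illustrated without any FRT library. The sketch below is not drawn from the reviewed studies; it assumes an upstream face detector has already supplied a bounding box (detection itself is omitted) and simply pixelates that region of a grayscale image:

```python
def pixelate_region(image, box, block=8):
    """Anonymize a detected face region by pixelating it.

    image: 2D list of grayscale pixel values (list of rows of ints).
    box:   (top, left, bottom, right) bounds, half-open, assumed to
           come from a prior face-detection step that is not shown.
    block: side length of the mosaic cells; larger means coarser.
    """
    top, left, bottom, right = box
    for by in range(top, bottom, block):
        for bx in range(left, right, block):
            # Clip the cell to the bounding box.
            y_end = min(by + block, bottom)
            x_end = min(bx + block, right)
            cell = [image[y][x]
                    for y in range(by, y_end)
                    for x in range(bx, x_end)]
            mean = sum(cell) // len(cell)
            # Overwrite every pixel in the cell with the cell average,
            # destroying the fine detail that recognition relies on.
            for y in range(by, y_end):
                for x in range(bx, x_end):
                    image[y][x] = mean
    return image
```

In practice, irreversibly overwriting the region (for example, with a solid block) offers a stronger guarantee than pixelation or blurring, since heavy downsampling can sometimes be partially reversed by reconstruction attacks.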
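The disparate false positives and negatives attributed to homogeneous databases [12], and the algorithmic audits proposed in response [38], rest on a simple calculation that can be sketched directly. The following is illustrative only (the data format and grouping scheme are assumptions, not taken from the reviewed studies): it computes the false match rate per demographic group from labeled one-to-one verification trials:

```python
def false_match_rates(trials):
    """Per-group false match rate (FMR) from labeled verification trials.

    trials: iterable of (group, same_person, predicted_match) tuples,
    where `group` is a demographic label attached to the evaluation
    data. FMR = impostor pairs wrongly accepted / all impostor pairs.
    """
    impostors = {}      # group -> number of impostor pairs seen
    false_matches = {}  # group -> impostor pairs wrongly accepted
    for group, same_person, predicted_match in trials:
        if not same_person:  # impostor pair: the two faces differ
            impostors[group] = impostors.get(group, 0) + 1
            if predicted_match:
                false_matches[group] = false_matches.get(group, 0) + 1
    return {g: false_matches.get(g, 0) / n for g, n in impostors.items()}
```

Large gaps between groups in such a table are the kind of disparity that demographic evaluations report; the analogous false non-match rate is computed the same way over genuine pairs.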
Emotivism, or using feelings as a primary determiner of ethics, also emerged in the literature. The behavior (n=932) of facial recognition software mattered to ethical calculations. The literature proposed that the software could interpret human sentiment, making the software complicit in unethical behavior: "[T]he important features for human facial recognition…are the eyes, mouth, and nose, although the eyes and eyebrows were treated as one unit" [42, p. 2].

The extent to which facial recognition involved human feelings (n=694) also mattered to whether the software was considered ethical. Per the literature, the variety of human emotions poses a real challenge for FRT's technical and moral sides. The variety might impact accuracy and usability: "There may be cases where multiple users need to set multiple preferences—a feature that could be enabled by facial recognition possible in many therapeutic robots" [46, p. 584]. These general differences are compounded by cultural differences in facial expressions that might further impact accuracy and usability: "[E]motions are cultural products...Emotions are not universal and do not exist by themselves. Facial moves that are interpreted as smile and happiness in one setting may well be interpreted rather differently in another context" [62, p. 603]. Still, articles also suggested not allowing fear to thwart investigations of FRT: "We should not let the fear of potential but inchoate threats to privacy…deter us from using facial recognition where it can produce positive benefits" [13, p. 103].

IV. DISCUSSION AND CONCLUSION

The literature presents a robust, diverse, meaningful conversation about the ethics of facial recognition technology. However, the relatively low average citations and field-weighted citation index scores of the literature show that we should pay more attention to the subject matter and give it more regard. In addition, few of the articles included acknowledgments or conflict-of-interest statements. This insight might show that we need to improve reporting on FRT research and scholarship. It might also mean that the conversation does not suffer from undue influence from industry. However, given the importance of having a well-rounded conversation that represents all stakeholders, it might also be in the best interests of the conversation to cultivate, include, and report more collaborations between non-profits, universities, governments, and industry partners.

The findings reveal important insights about the dimensions of ethical thinking as it pertains to facial recognition. First, the literature represented the business case for facial recognition technology, as well as its potential financial and economic implications. The question persisted whether facial recognition compromised consumer trust more than it provided convenience. Second, it discussed the value of facial recognition in school and health settings without discussing the implications of maintaining and using visual data on minors and other vulnerable populations such as students and patients. Finally, the literature questioned the validity of media claims about the benefits of facial recognition.

The issue of privacy dominated the discussion of the social and public deployment of facial recognition. The literature presented several definitions of privacy that varied from setting to setting: clinical vs. public vs. personal applications presumed different definitions of public vs. private, as did differences in international and multicultural applications. Questions about the tradeoff between privacy and claims of FRT increasing safety persisted.

Debates about privacy as an inherent value were also complicated because privacy has legal definitions and assurances. It may be the case that updates to legal definitions and precedents on privacy and safety will evolve to complicate matters further. In addition, hacking and other cybercrimes expose the inherent vulnerability of digital technologies. Those vulnerabilities, as well as regulations that poorly manage those technologies, degrade public trust in them.

The plurality of ethical approaches emerged as part of the problem. Different stakeholders held different perspectives. Different perspectives assign different valuations and definitions to privacy, freedom, safety, harm, and risk. It is hard to form consensus and create guidelines and regulations when stakeholders have different attitudes about the trustworthiness of industry and government and the vulnerabilities of FRT. In addition, legal and ethical questions about the duration and nature of data ownership and consent remain unresolved. These are core issues for ethical deliberation insofar as an initial opt-in might compromise and entangle users in ways that make disentanglement and opt-outs difficult if not impossible. Public, open debates are necessary for consensus-building and policymaking.

When businesses, governmental agencies, and organizations dominate the discussion about the ethics of facial recognition, the conversation tends toward abstract notions of benefit, harm, safety, and risk. However, the technology itself is implicitly personal, insofar as it creates calculations that can identify the unique dimensions of a person's body and face. Thus, the conversation often presumed that the algorithms could surmise people's emotions and identity. However, the unstated assumptions underpinning these principles and claims are reductive. They assume that insights are generalizable and consistent enough to trust in diagnoses of emotions and medical conditions from facial characteristics. The literature also posed concerns about these reductions, particularly since the technology has generated more false positives and negatives in vulnerable populations.

Furthermore, algorithms and machine learning are deployed to make facial recognition more perceptive and accurate. In that case, the algorithm itself makes ethical decisions alongside its programmers and, eventually, perhaps independently. This nonhuman autonomy adds another layer of complexity to ethical and legal evaluations, especially when the technology makes mistakes. We must assess the liability and blame of nonhuman agents. Here, work on ethical deliberation within actor networks might lend insight [64].

Adding nonhuman agents to the mix complicates how we evaluate ethical dilemmas. Nonhuman agents are becoming partners in decision-making. Because those decisions impact humans, this raises the stakes of program integrity and quality [64]. Developers can make explicit and transparent the moral judgments baked into the decision models that algorithms manifest. They can design and execute code that mitigates
dangerous generalizations and moves toward empathy and justice, and they can make transparent the logic underpinning nonhuman agent decision making. They can resist assumptions, such as the inevitability of bad action or social harm, that would relinquish their responsibility to interrogate and improve algorithm generalizations and design.

Medicine, education, and law and order were the industries most cited as ripe for facial recognition implementations. The literature mentioned seemingly intuitive applications of facial recognition in criminal profiling and disease diagnosis. However, these industries very intimately affect people's lives and livelihoods. Unfortunately, the literature offered more hypothetical scenarios than generalizable evidence to anchor claims about the harm, benefit, risk, and safety of facial recognition. The opinions and scenarios included some positive sentiment around benefits for business and finance and negative sentiment around the implications and handling of facial recognition as it pertains to human factors such as emotions and perception.

Most articles considered the consequences of FRT as key to ethical decision-making. To help ensure ethical deployments, the literature most often talked about setting up rules and policies, considering the circumstances of deployments, and providing ethical training to developers and stakeholders. The literature admitted that no one ethical approach could address the complex problems that FRT presents. But it also urged the importance of preserving and prioritizing open debate, transparency, and human autonomy, particularly for the sake of vulnerable populations. Thus, we must make FRT ethical decisions transparent, collaborative, and human-centered.

Finally, consequentialism was the most common philosophical framing of the ethics of FRT, and safety was touted as one of its essential virtues. Benefits and harms factored prominently in consequentialist approaches to facial recognition ethics; however, statements about FRT's benefits were anecdotal more so than evidence-based or generalizable. Therefore, future studies should investigate the real-world benefits and harms of facial recognition applications. Since many of the studies framed the ethics of facial recognition as situational and subjective, future studies should further investigate public perception using qualitative and quantitative methods. Furthermore, since much of the framing gestured toward trusting the rules, the character of the stakeholders, or the integrity of the algorithm itself, future studies should investigate evidence of regulatory compliance and make more transparent the calculations of industry stakeholders and the algorithms themselves. Future work should also outline frameworks for collaborative, open, and transparent decision-making and consensus-building among all FRT stakeholders, including designers, executives, politicians, government officials, and the public.

ACKNOWLEDGMENT

This work was sponsored by a research gift from the NEC Foundation.

REFERENCES

[1] A. Methot, "Facial Recognition Market by Component (Software Tools (2D Recognition, 3D Recognition, and Facial Analytics) and Services), Application Area (Emotion Recognition, Access Control, and Law Enforcement), Vertical, and Region - Global Forecast to 2024," Markets and Markets, July 2019. [Online]. Available: https://www.marketsandmarkets.com/Market-Reports/facial-recognition-market-995.html [Accessed Oct. 8, 2020].
[2] National Institute of Standards and Technology (NIST), "NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software," National Institute of Standards and Technology, Dec. 2019. [Online]. Available: https://www.nist.gov/news-events/news/2019/12/NIST-study-evaluates-effects-race-age-sex-face-recognition-software [Accessed Oct. 8, 2020].
[3] C. Maloney, "Facial Recognition Technology (Part 1): Its Impact on our Civil Rights and Liberties," House Committee on Oversight and Reform, May 2019. [Online]. Available: https://oversight.house.gov/legislation/hearings/facial-recognition-technology-part-1-its-impact-on-our-civil-rights-and [Accessed Oct. 8, 2020].
[4] C. Maloney, "Facial Recognition Technology (Part II): Ensuring Transparency in Government Use," House Committee on Oversight and Reform, June 2019. [Online]. Available: https://oversight.house.gov/legislation/hearings/facial-recognition-technology-part-ii-ensuring-transparency-in-government-use [Accessed Oct. 8, 2020].
[5] C. Maloney, "Facial Recognition Technology (Part III): Ensuring Commercial Transparency & Accuracy," House Committee on Oversight and Reform, Jan. 2020. [Online]. Available: https://oversight.house.gov/legislation/hearings/facial-recognition-technology-part-iii-ensuring-commercial-transparency [Accessed Oct. 8, 2020].
[6] Tropes, Semantic Knowledge, 2014. [Online]. Available: https://www.semantic-knowledge.com/tropes.htm [Accessed May 30, 2021].
[7] M.V. Dias and L.P.L. Mercado, "An instrument for the assessment of learning in online education from the content analysis," in 2016 International Symposium on Computers in Education (SIIE), IEEE, 2016, pp. 1-6.
[8] P. Molette and A. Landré, Tropes, Version VF 8.4. [Software], 2014.
[9] Y.R. Tausczik and J.W. Pennebaker, "The psychological meaning of words: LIWC and computerized text analysis methods," Journal of Language and Social Psychology, vol. 29, no. 1, pp. 24-54, 2010.
[10] A.A. Adams and J.M. Ferryman, "The future of video analytics for surveillance and its ethical implications," Security Journal, vol. 28, no. 3, pp. 272-289, 2015.
[11] D. Avexander and J. Richert-Boe, "Ethics of Facial Recognition Technology," Ethica Publishing, 2011. [Online]. Available: https://bit.ly/3ijLD9t [Accessed Oct. 8, 2020].
[12] N. Bacci, J. Davies, M. Steyn, and N. Briers, "Development of the Wits Face Database: an African database of high-resolution facial photographs and multimodal closed-circuit television (CCTV) recordings," F1000Research, vol. 10, 2021.
[13] P. Brey, "Ethical aspects of facial recognition systems in public places," Journal of Information, Communication and Ethics in Society, no. 2, pp. 97-109, 2004.
[14] C. Castelluccia and D. Le Métayer, "Position Paper: Analyzing the Impacts of Facial Recognition," in Annual Privacy Forum, New York: Springer, Cham, 2020, pp. 43-57.
[15] D. Castelvecchi, "Is facial recognition too biased to be let loose?" Nature, vol. 587, no. 7834, pp. 347-349, 2020.
[16] A. Cavoukian, M. Chibba, and A. Stoianov, "Advances in biometric encryption: Taking privacy by design from academic research to deployment," Review of Policy Research, vol. 29, no. 1, pp. 37-61, 2012.
[17] K. Crawford, "Halt the use of facial recognition technology until it is regulated," Nature, vol. 572, no. 7771, pp. 565-566, 2019.
[18] B. Dieterle, "People As Data?: Developing an Ethical Framework for Feminist Digital Research," Computers and Composition, vol. 59, 2021.
[19] Editors, "Facial recognition research needs an ethical reckoning," Nature, Nov. 18, 2020. [Online]. Available: https://www.nature.com/articles/d41586-020-03256-7 [Accessed May 29, 2021].
[20] W. Espeland and V. Yung, "Ethical dimensions of quantification," Social Science Information, vol. 58, no. 2, pp. 238-260, 2019.
[21] M. Esposito, J. Entsminger, and L. Xiong, "A Manager's Introduction to A.I. Ethics," in New Leadership in Strategy and Communication, N. Pfeffermann, Ed., New York: Springer, Cham, 2020, pp. 81-92.
[22] S. Hongladarom, "Machine hermeneutics, postphenomenology, and facial recognition technology," AI & SOCIETY, pp. 1-8, 2020.
[23] J. Hood, "Making the body electric: The politics of body-worn cameras and facial recognition in the United States," Surveillance & Society, vol. 18, no. 2, pp. 157-169, 2020.
[24] L.D. Introna, "Disclosive ethics and information technology: Disclosing facial recognition systems," Ethics and Information Technology, vol. 7, no. 2, pp. 75-86, 2005.
[25] T. Jarvis, D. Thornburg, A.M. Rebecca, and C.M. Teven, "Artificial Intelligence in Plastic Surgery: Current Applications, Future Directions, and Ethical Implications," Plastic and Reconstructive Surgery Global Open, vol. 8, no. 10, 2020.
[26] S.A. Javadi, R. Cloete, J. Cobbe, M.S. Ah Lee, and J. Singh, "Monitoring Misuse for Accountable 'Artificial Intelligence as a Service'," in Proceedings of the AAAI/ACM Conference on A.I., Ethics, and Society, ACM, 2020, pp. 300-306.
[27] R. Jenkins, Z.I. Rentz, and K. Abney, "Big Brother Goes to School," Techné: Research in Philosophy and Technology, vol. 25, no. 1, 2021.
[28] K. Macnish, "Privacy in research ethics," Handbook of Research Ethics and Scientific Integrity, pp. 233-249, 2020.
[29] N. Martinez-Martin, "What are important ethical implications of using facial recognition technology in health care?" AMA Journal of Ethics, vol. 21, no. 2, p. E180, 2019.
[30] M.D. McCradden, E. Patel, and L. Chad, "The point-of-care use of a facial phenotyping tool in the genetics clinic: An ethics tête-à-tête," American Journal of Medical Genetics Part A, vol. 185, no. 2, pp. 658-660, 2021.
[31] D. Mery, "Face Analysis: State of the Art and Ethical Challenges," in Pacific-Rim Symposium on Image and Video Technology, New York: Springer, Cham, 2019, pp. 14-29.
[32] E. Mordini and H. Ashton, "The transparent body: Medical information, physical privacy and respect for body integrity," in Second Generation Biometrics: The Ethical, Legal and Social Context, E. Mordini and D. Tzovaras, Eds., New York: Springer, 2012, pp. 257-283.
[33] V. Nabbosa and C. Kaar, "Societal and Ethical Issues of Digitalization," in Proceedings of the 2020 International Conference on Big Data in Management, 2020, pp. 118-124.
[34] V.L. Nabbosa, "Me Too: Value Creation by Digitalization and Data Privacy," in Proceedings of the 4th International Conference on E-Education, E-Business, and E-Technology, 2020, pp. 20-24.
[35] I. Nesterova, "Mass data gathering and surveillance: the fight against facial recognition technology in the globalized world," in SHS Web of Conferences, vol. 74, EDP Sciences, 2020, p. 03006.
[36] N.A. Pop, D. Dan-Cristian, and A.M. Iorga, "Ethical Considerations Regarding Stakeholders in Neuromarketing Research: Empirical Insights from NMSBA Corporate Members, TAAN Advertising Agencies and Romanian Companies," in Ethics and Neuromarketing, New York: Springer, Cham, 2017, pp. 123-146.
[37] I.D. Raji, T. Gebru, M. Mitchell, J. Buolamwini, J. Lee, and E. Denton, "Saving face: Investigating the ethical concerns of facial recognition auditing," in Proceedings of the AAAI/ACM Conference on A.I., Ethics, and Society, 2020, pp. 145-151.
[38] I.D. Raji and J. Buolamwini, "Actionable auditing: Investigating the impact of publicly naming biased performance results of commercial AI products," in Proceedings of the 2019 AAAI/ACM Conference on A.I., Ethics, and Society, 2019, pp. 429-435.
[39] P. Rashidi, D.A. Edwards, and P.J. Tighe, "Primer on machine learning: utilization of large data set analyses to individualize pain management," Current Opinion in Anesthesiology, vol. 32, no. 5, pp. 653-660, 2019.
[40] D.B. Resnik and K.C. Elliott, "Using drones to study human beings: ethical and regulatory issues," Science and Engineering Ethics, vol. 25, no. 3, pp. 707-718, 2019.
[41] L.A. Ricciardelli, S. McGarity, and L. Nackerud, "Social work education and the recognition of rights in the digital tech age: implications for professional identity," Social Work Education, pp. 1-15, 2020.
[42] E. Roberts, C. Troiano, and J.H. Spiegel, "Standardization of guidelines for patient photograph deidentification," Annals of Plastic Surgery, vol. 76, no. 6, pp. 611-614, 2016.
[43] A. Rossi, "Resisting the rise of facial recognition," Nature, vol. 587, no. 7834, pp. 350-353, 2020.
[44] C.B. Sanders and J. Sheptycki, "Policing, crime and 'big data': towards a critique of the moral economy of stochastic governance," Crime, Law and Social Change, vol. 68, no. 1, pp. 1-15, 2017.
[45] E. Santow, "Emerging from A.I. Utopia," Science, vol. 368, no. 6486, p. 9, 2020.
[46] E. Seidenberg, J. Chuang, and D. Mulligan, "Designing commercial therapeutic robots for privacy-preserving systems and ethical research practices within the home," International Journal of Social Robotics, vol. 8, no. 4, pp. 575-587, 2016.
[47] S. Serholt, W. Barendregt, A. Vasalou, P. Alves-Oliveira, A. Jones, A. Petisca, and A. Paiva, "The case of classroom robots: teachers' deliberations on the ethical tensions," AI & Society, vol. 32, no. 4, pp. 613-631, 2017.
[48] T. Sharon and B.J. Koops, "The ethics of inattention: revitalising civil inattention as a privacy-protecting mechanism in public spaces," Ethics and Information Technology, pp. 1-13, 2021.
[49] P. Skeba and E.P.S. Baumer, "Informational Friction as a Lens for Studying Algorithmic Aspects of Privacy," Proceedings of the ACM on Human-Computer Interaction, vol. 4, no. CSCW2, pp. 1-22, 2020.
[50] M. Smith and S. Miller, "The ethical application of biometric facial recognition technology," AI & Society, pp. 1-9, 2021.
[51] S. Tanwar, S. Tyagi, N. Kumar, and M.S. Obaidat, "Ethical, legal, and social implications of biometric technologies," in Biometric-Based Physical and Cybersecurity Systems, M.S. Obaidat, I. Traore, et al., Eds., New York: Springer, Cham, 2019, pp. 535-569.
[52] M. Trim, "Essentialism is the enemy of the good," ACM SIGCAS Computers and Society, vol. 49, no. 2, 2021.
[53] J. Tromp, C. Le, B. Le, and D.N. Le, "Massively multi-user online social virtual reality systems: ethical issues and risks for long-term use," in Social Networks Science: Design, Implementation, Security, and Challenges, New York: Springer, Cham, 2018, pp. 131-149.
[54] R. Van Noorden, "The ethical questions that haunt facial-recognition research," Nature, vol. 587, no. 7834, pp. 354-358, 2020.
[55] R. Van Noorden, "What scientists really think about the ethics of facial recognition research," Nature, 2020.
[56] V. Wati, K. Kusrini, H. Al Fatta, and N. Kapoor, "Security of facial biometric authentication for attendance system," Multimedia Tools and Applications, pp. 1-22, 2021.
[57] J. Wickins, "The ethics of biometrics: the risk of social exclusion from the widespread use of electronic identification," Science and Engineering Ethics, vol. 13, no. 1, pp. 45-54, 2007.
[58] D.P. Williams, "Fitting the description: historical and sociotechnical elements of facial recognition and anti-black surveillance," Journal of Responsible Innovation, pp. 1-10, 2020.
[59] J. Winter, "Algorithmic discrimination: Big data analytics and the future of the Internet," in The Future Internet, New York: Springer, Cham, 2015, pp. 125-140.
[60] M.I. Zarkasyi, M.R. Hidayatullah, and E.M. Zamzami, "Literature Review: Implementation of Facial Recognition in Society," in Journal of Physics: Conference Series, vol. 1566, no. 1, IOP Publishing, 2020, p. 012069.
[61] S. Yoo, "Ads Are Watching You: Advertising Applications of Facial Recognition Technology and Communication Ethics," in Ethical Issues in Communication Professions, M. Drumwright, Ed., New York: Routledge, 2013, pp. 264-274.
[62] N. Grandjean, M. Cornélis, and C. Lobet-Maris, "Sociological and Ethical Issues in Facial Recognition Systems: Exploring the Possibilities for Improved Critical Assessments of Technologies?" in 2008 Tenth IEEE International Symposium on Multimedia, IEEE, 2008, pp. 602-606.
[63] D. Liu, D. Cheng, T.T. Houle, L. Chen, W. Zhang, and H. Deng, "Machine learning methods for automatic pain assessment using facial expression information: protocol for a systematic review and meta-analysis," Medicine, vol. 97, no. 49, 2018.
[64] A.K. Roundtree, "ANT Ethics in Professional Communication: An Integrative Review," American Communication Journal, vol. 22, no. 1, 2020.