
An Unexpected Societal Critic

How Facial Recognition Has Unearthed Foundational Flaws in Society While Teetering Between Villain and Hero

Throughout history, technology's exponential growth and its impact on our lives have raised many discussions about privacy and safety. Especially since the internet came to life in the early '90s, we have felt disconnected from the personas we create online, even though the abilities the technology grants do affect our everyday lives. Facial recognition has created one of the most direct links between our online persona and our outward life. Instead of a VPN or an IP address we might hide behind online, facial recognition links us by our individual faces and builds a full map of our lives. It has a place in law enforcement, safety, and even health; yet in every way it may help, there are also ways it may be abused. The technology continues to grow while our guidelines remain stunted. Facial recognition has, yet again, opened the conversation about privacy and what really is 'private' within our lives. Personally, I believe facial recognition is a wonderful tool, but, much like any tool, it must be used properly and with consent if it is truly to benefit the people it affects.

One of the biggest concerns with facial recognition is its use within the justice system or at a governmental level. China has shown both the great benefits and the drawbacks of having facial recognition in daily life, and it offers one example of how the technology could be used in other countries in the future. There are moments when China's use of facial recognition brings safety and justice. One particular intersection in the city of Xiangyang was crossed by many jaywalkers and speeding cars. Using facial recognition, the government publicly shamed those who jaywalked by displaying them on a large billboard next to the very intersection where they chose to cross (Mozur, 2021). The public shaming includes a photo of the jaywalker in the midst of their offense along with their name and government ID number. Although the system isn't very fast, showing jaywalkers from four to five days prior rather than instantly, the public shaming has been enough to decrease jaywalking rates and thus make the intersection safer.

The Chinese government has also used facial recognition to distribute resources evenly, making sure everyone gets what they need and no one takes extra. One example is public restrooms and the distribution of toilet paper. People will often take large quantities, if not an entire roll, to avoid buying their own. Using facial recognition, a dispenser knows who has already received toilet paper and will refuse to give out any more within a certain time frame. The system reduces overuse and theft of the resource simply by scanning someone's face.
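
Strip away the camera and the dispenser's logic is essentially rate limiting keyed on identity. Here is a minimal sketch of that idea, assuming a recognition model elsewhere has already turned the camera frame into a stable face ID; the ten-minute window and the `request_paper` helper are illustrative, not details of any reported system.

```python
import time

# Identity-keyed rate limiting: the core logic of a face-scanning dispenser.
# The ten-minute window is an illustrative guess; a real system would derive
# face_id from a recognition model rather than receive it directly.

WINDOW_SECONDS = 10 * 60                 # refuse repeat requests within this window
last_dispensed: dict[str, float] = {}    # face ID -> time of last dispense

def request_paper(face_id: str) -> bool:
    now = time.time()
    last = last_dispensed.get(face_id)
    if last is not None and now - last < WINDOW_SECONDS:
        return False     # same face asked again too soon: no extra paper
    last_dispensed[face_id] = now
    return True          # dispense and remember when

# The first request succeeds; an immediate second one is refused.
assert request_paper("face-042") is True
assert request_paper("face-042") is False
```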

In China, facial recognition is even used to turn someone's face into their wallet, more or less. The user's bank accounts, library cards, credit cards, and passport are all connected to their face. 'Smile to Pay' is a common way to pay in stores in larger cities: simply look into the monitor at the cashier counter and your groceries are paid for with your face. Many citizens are excited by the technology and find it fast and reliable. Around the world, passports are now being verified using just your face. Smiling into a camera has never meant so many things before, but now you can pay for a snack and board a plane just by looking into the lens.

While the benefits are tempting, they are the gateway to the many ways facial recognition can be abused. China is already a rather touchy state, closely monitoring what is said or done against it. While facial recognition can give you access to many things, it can also be used to take that access away. Speak ill of the government and suddenly you can no longer pay for groceries, your passport is invalid, and you are no longer allowed to travel beyond a specific radius of your home; if you do, they will know immediately. While this shutdown of privileges is most often directed at 'criminals', the Chinese government's definition of a criminal could amount to little more than an unhappy worker who talked smack once.

There is also a major racial injustice within China against Uyghur Muslims that facial recognition has only amplified. The system is used to identify and track down Uyghur Muslims to be placed in concentration camps. The system is often used to target minorities, and there is no safe way to scrutinize this usage without becoming another target (Ng, 2020). The fault does not lie with the system itself; the users of such a system are the ones who dictate how the tool is applied.

The tool itself, regardless of user or usage, is biased from the start. Facial recognition must be trained to detect that there is indeed a face before it can learn whom that face belongs to. This shows in its lack of accuracy for specific groups of people. In China, the training material consists mostly of the Asian population, so other ethnicities are hard for the system to identify correctly. It's much like a child learning what a zebra is without ever having seen one; they would just call it a striped horse. While, technically, that isn't wrong, we can't say it's right, especially if it could mean someone gets pinned for crimes done by someone else who just so happens to look alike to the system.
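
This kind of skew is measurable. Below is a minimal sketch of the sort of audit that exposes it, comparing a system's identifications against ground truth group by group; the records are invented for illustration, not real benchmark results.

```python
from collections import defaultdict

# Accuracy audit broken down by demographic group. The records are invented
# for illustration; a real audit would use a labeled test set.

records = [
    # (true identity, system's guess, demographic group)
    ("alice", "alice", "group_a"),
    ("bob",   "bob",   "group_a"),
    ("chen",  "chen",  "group_a"),
    ("dara",  "eve",   "group_b"),   # a misidentification
    ("eve",   "eve",   "group_b"),
]

hits: dict[str, int] = defaultdict(int)
totals: dict[str, int] = defaultdict(int)
for truth, guess, group in records:
    totals[group] += 1
    hits[group] += int(truth == guess)

for group in sorted(totals):
    print(f"{group}: {hits[group]}/{totals[group]} correct "
          f"({hits[group] / totals[group]:.0%})")
```

A system trained mostly on one group tends to show exactly this pattern: near-perfect accuracy for the overrepresented group and a visibly worse rate for everyone else.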

A facial recognition system misidentifying an innocent person as a criminal has already happened with a system used in the United States. A Michigan man was identified as the shoplifter of nearly $5,000 worth of watches and jewelry, and that was only the first mistake. The results of the AI search were supposed to be used only for investigation, not as grounds for arrest. That would mean the police could use the information only as a lead, a clue as to where to look for the perpetrator. The police were then expected to search for further proof that the identified man had actually committed the crime, such as eyewitness accounts, ownership of the clothes seen at the crime scene, or alibis he might have (Hill, 2021). The police failed to do more than show a picture of the innocent man to the clerk working during the crime.

The misidentification led to nearly 30 hours in jail for an innocent man and trauma for an entire family. The Michigan man had posted a video to his Instagram while driving home from work at the very time the crime took place, an alibi the police failed to discover. The most heart-wrenching moment of the entire incident came only a few hours in when, during interrogation, the man asked officers, "That ain't me, do you think all black men look alike?" (Hill, 2021).

This unfortunate string of events all sprouted from the bias built into the system itself. The training material consists mostly of white males, making the system exceptional at identifying them; black men and women, however, are more likely to be misidentified because of it. Not only has photography failed those with darker skin, leaving older photos of them poorly exposed, it also meant that accurate training material for these systems was either scarce or purposely left out, making this all-powerful tool fail them before it could even be used correctly.

Facial recognition had, however, already been in our lives for a while before it became this all-seeing system. Instagram, Skype, and Snapchat used the forefather of facial recognition, face detection, to create face filters: the program could detect a face in frame and overlay a special effect, but it didn't need to know whom that face belonged to.
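
That distinction between detecting a face and recognizing one is easy to see in code. Below is a minimal sketch using OpenCV's bundled Haar cascade detector; the image path is a placeholder, and the point is that detection returns locations only, never names.

```python
import cv2  # pip install opencv-python

# Face *detection*, the filter-era technology: find where faces are in an
# image without attaching any identity to them. "photo.jpg" is a placeholder.

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Returns bounding boxes (x, y, width, height): locations only, no names.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Each box is where a filter would be overlaid; whose face it is would
    # be a separate recognition step the filter never performs.
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

print(f"Detected {len(faces)} face(s), identified 0 of them.")
```

Recognition, attaching a name to each of those boxes, is the separate and far more invasive step that came later.
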
Facebook and Google Images, meanwhile, have been using full facial recognition to identify and accurately tag people in photos for easier searching and sharing. Even before that, CCTV cameras had been keeping watch everywhere from the darkest corners to the busiest streets, not to identify people but simply to record. These pieces of the system were in our lives for years before facial recognition pulled them together and began to grow its database and training material. We were comfortable with the separate pieces, but combined they become a scary monster that laws and regulations have yet to adapt to. The biggest strain is criminal justice's use of facial recognition, but there are other areas where facial recognition undeniably helps more than it hurts.

Health care has begun to use facial recognition to document patients' ongoing health and even to identify illnesses and genetic conditions. Many genetic conditions affect the face, but the changes can be very slight, making them hard to identify. Facial recognition has come in clutch, giving healthcare providers more confidence when diagnosing hard-to-catch genetic conditions. Face2Gene is one of the systems being used to identify those conditions, and although it isn't 100% accurate, it is still better at predicting the conditions than doctors alone. The system isn't being used to diagnose conditions off the bat; instead, it gives doctors a second opinion before they conduct hundreds of dollars' worth of tests. While the system has proven itself on multiple occasions, it too is affected by its biased training material and often drops in accuracy for patients with darker skin. However, the training material expands every time a new face is uploaded; in the long run, the more the system is used, the more accurate it will become. The only downside is that the program is meant to identify rare conditions, so gaining more training material for identifying them is, well, rare (Dolgin, 2019).
Once a health issue has been identified, facial recognition is still useful elsewhere within health care. The AI is used to identify patients before they go into extensive surgery, to help identify nurses, and to keep logs of patient care. Facial recognition can also be used to monitor patients' ongoing health by detecting "subtle, involuntary facial expressions [that] could indicate if patients are feeling pain or even consciousness during intense surgeries" (Grifantini, 2021).

Those who are visually impaired can also use facial recognition to, technically, 'see'. Partnered with another AI, facial recognition can tell you who is in a photo, what they are doing, and where they might be, across photos and videos on social media. Facebook has taken the same tagging system and used it to verbally describe images to those who are visually impaired. This has expanded accessibility on the internet for people who cannot use the usual visual interface and allowed the visually impaired to connect more with the content others wish to share with them (Lehrer-Stein, 2021). Of course, this also brought privacy concerns, which Facebook was able to acknowledge by allowing users to opt out of the auto-tagging program, effectively steering the facial recognition away from utilizing photos on their personal accounts. With proper consent, this tool can benefit many, sighted or not; however, Facebook is currently the only social media platform that uses the AI system for accessibility rather than relying on the default and often unhelpful 'alt text' settings.

That being said, facial recognition isn't as powerful as we make it out to be. The training material is mainly photos of a person in a controlled environment, like a photo studio or a mugshot. Identifying a moving face in a pixelated video is extremely inaccurate and has only been getting better thanks to higher-quality cameras, not the system itself. Most of the time the system needs you to be still for a few seconds before it can identify you at all. As recently as a few years ago, Chinese facial recognition was only about halfway automated, meaning that the other half of the time humans were confirming identities. Facial recognition's biggest threat is its widespread network of surveillance tools, not its actual AI network. However, this gives us time to have the important discussions about what rules and guidelines need to be in place for when it actually fills those shoes. Many are calling for all-out bans, while others recognize its benefits and are asking for better regulations.

Overall, facial recognition is an amazing tool. It often amplifies flaws that society already has on its own, and it has been pushed into a generalized negative stigma. Facial recognition has helped us recognize racial, accessibility, and privacy issues, but the conversations about these issues have been slow and inflexible. Currently, it isn't the tool or the guidelines that are inherently bad; it's the users. Our laws have lagged behind technology for so long that many topics predating facial recognition's own conception have resurfaced and drawn attention to foundational issues within the society hoping to utilize it. Its benefits are great and the tool is strong, but without consent, and with its weaknesses ignored, facial recognition will begin to hurt more than help.


References

Dolgin, E. (2019). AI face-scanning app spots signs of rare genetic disorders. Nature. https://doi.org/10.1038/d41586-019-00027-x

Grifantini, K. (2021). Detecting faces, saving lives: How facial recognition software is changing health care. IEEE Pulse. https://www.embs.org/pulse/articles/detecting-faces-saving-lives/

Hill, K. (2021). Wrongfully accused by an algorithm. The New York Times. https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html

Lehrer-Stein, J. (2021). What it's like to use Facebook when you're blind. The New York Times. https://www.nytimes.com/2020/01/17/opinion/sunday/facebook-facial-recognition-accessibility.html

Mozur, P. (2021). Inside China's dystopian dreams: A.I., shame and lots of cameras. The New York Times. https://www.nytimes.com/2018/07/08/business/china-surveillance-technology.html

Ng, A. (2020, August 11). How China uses facial recognition to control human behavior. CNET. https://www.cnet.com/news/in-china-facial-recognition-public-shaming-and-control-go-hand-in-hand/
