All Over Your Face

Advanced facial-recognition technology can deduce aspects of our personality as well as our identity. Will this new fact of life change the way we act?

By Matthew Hutson, published on January 2, 2018 - last reviewed on January 6, 2018

Even as we debate the ethics, facial analysis advances at an accelerating rate. Amazon, for example, is testing grocery stores that track users as they shop. Such technology has the potential to make our lives safer, more convenient, and better customized to our individual needs, but it can also entrap us behind literal bars or those of social norms or paranoia. As the machines' learning advances, step by step, we must make or accept tradeoffs, explicitly or implicitly. That's why it's worth looking into those electronic eyes, to understand their applications and their social risks and benefits.
The End of Hiding

Last year, a Russian firm launched the website FindFace, which matches submitted photos to profiles on the social networking site VK, a regional Facebook imitator. If a stranger photographs you in the street or spots your image on another site, and you're on VK, then FindFace can likely identify you by name. Trolls immediately began using the site to out actresses in adult videos, harassing them and their friends and shaming them on discussion boards with epithets like "burnt whore."

Meanwhile, Moscow police use facial recognition on a network of 160,000 security cameras across the city, and China is using cameras with facial recognition to tag jaywalkers. You can also use your face to pay at some KFCs in China, and it's required before toilet paper can be dispensed at some public restrooms. In Dubai, police wear Google Glass devices that identify the faces of people in front of them. Here at home, the faces of half of all American adults are already in the government's facial-recognition system. It's becoming harder to go about your life in private, online or off, anywhere in the world. You don't need to be a porn star or a crook to find that unnerving.

Now researchers are developing techniques that not only identify people by their faces but also infer what's in their minds. Our expressions signal our emotions, and our facial structure can hint at our genetic makeup. We've always known that faces convey information to others, but now ever-present electronic eyes can watch us with untiring attention and with the training to spot our most fleeting micro-expressions.

Users of dating sites delicately curate what they reveal online, hiding information that they consider unbecoming or that unwanted suitors might use to pursue them beyond the site. But a pseudonym doesn't deliver what it used to. To see how easily a stranger can learn personal information about you, Carnegie Mellon University privacy researcher Alessandro Acquisti conducted an experiment. He and two collaborators first used a web browser to collect profile photos of about 5,000 Match.com users in a North American city. They also collected the primary photos of about 100,000 Facebook users in the same city. Using a commercially available piece of software called PittPatt, they were able to match about one in 10 Match faces to a Facebook face. Before the introduction of such algorithms, the task would have required 500 million comparisons by hand.

For the researchers' next act, they pulled in college students walking by their building and took three photos of each of them. They asked the students how they'd feel if a stranger could photograph them and predict their interests and Social Security numbers. On a scale from 1 (most comfortable) to 7 (most uncomfortable), the average ratings were about 5 and 6, respectively. The researchers then proceeded to do just that. They matched the students' photos to Facebook profiles and grabbed their real names, interests, and other information. Then they used those data points and another algorithm to search online and dredge up Social Security numbers. For about a quarter of the participants, they were able to guess the first five digits—enough to run a brute-force identity attack for the remaining four—within a few attempts. The method could easily be improved with more photos or slightly better algorithms. Sample responses from the students: "very worrisome," "surprised and shocked," "freaky ... makes me reassess what I should ever reveal on the internet." Just for fun, Acquisti's team coded up a demo augmented-reality iPhone app: Point the phone's camera at a stranger, and next to the person's head it displays his or her name, SSN, and date and state of birth.

Acquisti relied only on primary profile photos, but people upload billions of other photos to Facebook every month, many of them tagged by name. A recent study found that by using albums, comments, information about where and when photos were taken, friend networks, and the bodies and backgrounds displayed, even people in untagged photos with their faces blurred could be identified. "People like to think that they're anonymous and invisible, despite posting lots and lots of information about themselves all over the internet," says psychologist Nicholas Rule of the University of Toronto, who studies social perception. "It all feels private from your living room, but it's the digital equivalent of posting a billboard on the side of a major highway."

Illustration by John Gall

Recent advances have made such unintentional broadcasting possible, primarily in an area of artificial intelligence known as machine learning, in which computers discover patterns in data by themselves. Landmarks in machine learning—self-driving cars, Go-playing computers, automatic language translation—have resulted from three main factors. First, computing power has steadily increased, and new specialized chips tailored for machine learning can run algorithms exponentially faster and more efficiently.

Second, "big data" has gotten bigger; remember those billions of Facebook photos. We're surrounded by sensors collecting information about the world and feeding it into databases. This information doesn't just open up our personal lives; it helps to train the computers, which need massive numbers of examples to learn from. A child can see one hot dog and recognize other hot dogs for life, but a computer needs to "see" thousands or millions.

Third, the algorithms have improved. Developing artificial neural networks, or neural nets, is the hottest area of machine learning right now. These software models work somewhat similarly to the brain. "Neurons" each process little bits of information, then pass them on to other layers of the net. At first the strength of the connections between neurons is random. But over time, as the network guesses correctly or incorrectly (nope, not a hot dog), it receives feedback and adjusts accordingly. Once it's trained, it's ready to be used in situations where the answer is not known in advance.

Neural nets can have millions of neurons arranged in dozens of layers for what is called deep learning. There are many ways to arrange the neurons, but one of the most important architectures at the moment is a convolutional neural net, or ConvNet. These algorithms rely on convolution, a mathematical operation that allows them to recognize patterns even as they vary slightly, the way you can recognize a face no matter where it falls on your retina.

Since 2012, ConvNets have been the standard tool for image recognition. For facial recognition, neural nets are sometimes used to translate an image like a Match.com photo into a manageable set of numbers representing facial features. Then another algorithm looks for a target image—say, a Facebook photo—with the most similar set of features. As computers, data sets, and algorithms keep improving, so will their ability to recognize us. The longer they work, the more they learn, and the more powerful and accurate they become. And they're just getting started.

Will facial recognition change how we act? One possibility: People won't care. Unless we can see the cameras, see the people looking at our online profiles, and see how they're using our information, we may forget about it. And even if we do care about our privacy in theory, we simply might not be able to maintain it in practice. "It would require a nearly superhuman effort for an individual to properly manage their privacy," Acquisti says. We're fairly good at managing our privacy offline: If you're having a sensitive conversation at dinner and the waiter walks by, you lower your voice. Online, however, you can't see the waiter. It's informational asymmetry, Acquisti says: We know little about what people know about us, who knows it, or what their intentions are.
We have some data on how people change when they feel watched. Studies in Sweden, England, and the United States show that security cameras moderately reduce crime in their immediate vicinity. After Edward Snowden revealed many federal surveillance practices, traffic to Wikipedia pages for topics such as "terrorism" and "dirty bomb" dropped.

A study in Helsinki took tracking to an intimate extreme: In each of 12 participating households, researchers installed microphones and three or four cameras, and also monitored wireless traffic and computer and cellphone activity (keypresses, screen shots). The intrusion lasted six months. In surveys and interviews, the subjects reported annoyance, anxiety, and occasional anger. As for sharing their video data, they said they'd be least comfortable with the authorities seeing it, even if they hadn't broken any laws, followed by public media, which could spin it into "commercial drama or something," and then friends and acquaintances. But on average, they adapted to the tech over time. Half said they assumed their internet use was already being monitored post-9/11. And while some changed their routines ("I kind of cannot have sex in the kitchen because of the camera"), others didn't ("After I realized that I'd already walked naked to the kitchen a couple of times, my threshold...got lower").

The Finnish group's varied responses should come as no surprise. Acquisti and colleagues have written about our inconsistencies when considering privacy. In one study, when participants were asked to rate their concern that a stranger might discern their sexual orientation, half of those who rated it a 7 out of 7 had already revealed their orientation on their Facebook profiles. People will also pay more to keep privacy than to acquire it, an example of the endowment effect. Even the temperature of a room can irrationally affect how much people will reveal.

If we wanted to be fully informed and calculating, we'd have to put the rest of our lives on hold: By one rough estimate, reading the privacy policies of every web site we visit would cost Americans $1 trillion a year in lost time.

The interpersonal effects of facial recognition remain clouded as well. What happens when we can no longer separate our personas and prevent our social worlds from mutual contamination? Or when someone meets you at a bar or clicks on your LinkedIn profile and can use your image to dredge up every other iota of your web presence, including footage of you at a kink club or political rally? Maybe we'll learn to forgive youthful indiscretion when photographic evidence of our entire lives is out there. Maybe we'll learn to see each other as more complete people. Or maybe we'll become paranoid and stop trusting one another. Either way, we won't have the space and control to nurture new relationships organically. "Privacy offers the ability to modulate your degree of intimacy with another person," Acquisti says.

The effects of accepting facial identification are both nefarious and salutary, often in combination. Surrendering anonymity denies us agency by holding us to our pasts and to who we are elsewhere. When strangers can call up your biography, warts, laurels and all, you can't start fresh each time you walk into a room or meet someone new. On the flip side, you can more easily avoid people with bad reputations; serial con artists will need to invest in a fresh selection of fake moustaches. We'll also enjoy a range of new conveniences and security advantages like walletless checkout and terrorist identification in crowds. Unfortunately, at this point, we can't know whether those benefits will outweigh the costs. "We are all part of a gigantic social experiment," Acquisti says.

What's Behind Our Faces

AI can not only identify us by our faces but also read the emotions on them. Our faces reveal more than just our biographies—who we are, what we've done, where we've been. They also reveal what's inside our heads. Facial expressions evolved to signal our mental state to others. Communication can occur strategically, as when we smile politely at a coworker's joke, or subconsciously, as when we display tells at the poker table. People are pretty good at reading expressions already, but machines open new opportunities. They can be more accurate, they don't get tired or distracted, and they can watch us when no one else is around.

One opportunity this opens up is helping people who aren't naturals at face reading. Dennis Wall, a biomedical data scientist at Stanford, has given Google's Glass to children with autism. They wear frames with a built-in camera connected to software that detects faces and categorizes their emotions. The devices can then display words, colors, or emoticons on a little screen attached to the glasses, which the child can view by looking up. The software can run constantly, or the children can play training games, such as one in which they try to guess someone's emotion. Parents can review recordings with a child and explain tricky social interactions. Children can't wear the device in the classroom, but teachers report that the training has improved engagement and eye contact. Wall says similar applications might help people with PTSD or depression, who, research shows, are biased to miss smiling.

Ned Sahin, a neuroscientist who has developed Glass apps for autistic children, says anyone could benefit from such assistance. "I make a joke any time I talk about it: Good thing we're doing this for people on the spectrum, because they need it and we don't. We've got this all dialed in," he says, emphasizing the irony. "And each of you knows exactly what your wife or husband is thinking at any time."

There are indeed some situations in which face-reading tech performs better than neurotypical people. In one study, individuals were recorded doing two tasks: watching a video of a baby laughing, which elicited smiles of delight, and filling out a frustrating web form, which elicited natural expressions of frustration, closely resembling smiles. When other participants viewed the recordings to categorize the smiles as delighted or frustrated, they performed no better than chance. A machine-learning algorithm, however, got them all right. In the real world, people would have contextual clues beyond facial expressions. "Coding facial movements in the absence of context will not reveal how someone feels or what they think most of the time," says Lisa Feldman Barrett, a psychologist and neuroscientist at Northeastern University.

In another experiment, participants watched videos of people holding an arm in ice water or holding an arm in warm water and pretending to look anguished. Subjects' scores at distinguishing real from faked pain expressions remained below 60 percent even after training. A machine-learning algorithm scored around 85 percent.

These studies raise the possibility of AI lie detectors—possibly deployed on something like Google Glass. What happens when our polite smiles stop working? When white lies become transparent? When social graces lose their lubricating power? Even if we have the technology to create such a dystopia, we may decide not to use it. After all, if someone says he likes your haircut, how hard do you currently try to test the comment's veracity? We prefer to maintain certain social fictions. "There will be a sector of humanity that will want stuff like that," Wall says, "but I think a majority will prefer just to sit down and have a conversation with somebody the old-school way."

Face-reading algorithms generally fall into one of two types. First, there are machine-learning algorithms (including neural networks) trained to translate an image directly into an emotional label. This process is relatively simple but deals best with stereotypical facial configurations, which can be rare. Second, there are methods that use a machine-learning algorithm (again including neural networks, or one called a support vector machine) to detect in an image a set of active "action units," or facial movements linked to underlying muscle contractions. Another algorithm then translates the action units into an emotional expression. This method is more flexible, but analyzing action units can be tricky. Once you add variations in lighting, head pose, and personal idiosyncrasy, accuracy drops.

Automatic face reading has wide applicability. Couples might use it to better understand each other—or to understand themselves and what signals they're really displaying in a conversation. Public speakers might use it to help read their audience during online or offline seminars or to practice their own body language. Teams might use it to monitor and improve group dynamics. Treaty negotiators or criminal investigators could use it for peace and security (or for manipulation).

In a recent book chapter, computer scientists Brais Martinez and Michel Valstar of the University of Nottingham outlined face reading's potential benefits for behavioral medicine in the diagnosis and treatment of such disorders as depression, anxiety, autism, and schizophrenia, as well as in pain management (evaluating injuries and tracking rehab). Louis-Philippe Morency, a computer scientist at Carnegie Mellon University, has used video analysis to find that depressed people don't smile less than other people but that their smiles are different—shorter and less intense. He's also found that depression makes men frown more and women frown less. He recently reported that using machine learning to analyze conversations with depressed people can predict suicidality. Algorithms can be more objective than people, and they can be deployed when doctors aren't around, monitoring people as they live their lives. They can also track subtle changes over time. Morency hopes that by giving doctors more objective, consistent measures of internal states to help them in their assessments, he can create "the blood test of mental health."

Affectiva, a company spun out of MIT's Media Lab, has collected data on six million faces from 87 countries and put facial analysis to work for dozens of clients. Uses include making a cute robot more responsive to learners during language lessons, making a giant light display respond to crowds, and analyzing legal depositions. The company is also working on automotive solutions that both monitor drivers' alertness to make sure they're always ready to take back control in semi-autonomous vehicles and measure mood for better customization of the driving experience.

Facial analysis is frequently used to measure audience response to ads, because a good deal of the money in tech is in advertising. It's also where much of the potential for abuse lies. In one study of supermarket shoppers, some participants expressed discomfort with the potential for micro-expression monitoring. "Understanding how you really feel about this product even though you might not know it yourself... that's a little spooky," one participant said. "It's like mining your thoughts more than just your buying habits."

Obviously, we need some extensive discussions about consent for facial analysis. Which norms and laws are necessary to maintain a sense of inner privacy? Facial analysis clearly has great value for users, but to the extent that we don't understand or think about our privacy, informed consent may be an illusion, and people will increasingly come to know us much better than we may be comfortable with.

Typecasting, With Accuracy

In 2014, an Israeli company named Faception launched, with the promise that its AI could classify several character types from faces—including the Bingo Player, the Academic Researcher, and the Pedophile. They don't reveal much about their clients, but claim they've done homeland-security work, presumably keeping citizens safe from bingo extremists. Questionable marketing pitches aside, the evidence suggests that facial structure really does reveal some internal traits.

Recently, a paper made a big splash by demonstrating that machine learning could guess sexual orientation from dating-site headshots much better than chance. The algorithm's AUC—a statistical measure that accounts for both false positives and false negatives, where 0.5 is chance and 1.0 is perfect—was 0.81 for men and 0.71 for women. Human guessers scored only 0.61 and 0.51, respectively. In other words, if the computer selected the 10 men most likely to be gay from a group of 1,000 photos, it would be right about 9 of them.

CONNECTING THE DOTS: Much facial-recognition technology identifies key landmarks on a subject's face. Above, a still from a Stanford study on sexual-orientation detection. Photo courtesy of Michal Kosinski

Michal Kosinski, a psychologist at Stanford University, and his collaborator wrote the paper as a warning of what's possible. They used an off-the-shelf neural network and other standard algorithms—ones available to any government, including those in countries where homosexuality is a crime punishable by death. Critics have argued that the algorithms might be relying on subtle differences in posing or grooming based on sexuality, thus reducing the study's validity, but even if that's the case, the additional facial cues can be used in the real world. What's more, the method doesn't need to be perfect to have an impact: It might simply be used as a prescreening device to narrow the range of people to investigate.

One danger in automating judgment about traits or inclinations is the risk of encoding biases while offering the illusion of objectivity. A recent paper by Chinese researchers used machine learning (including a standard convolutional neural net) to assess "criminality" based on headshots. But their basis for measuring criminality was not the committing of a crime, or even traits such as aggression or impulsiveness; it was the existence of a criminal conviction. And one's path through the justice system depends on subjective judgment at every step, including biases based on appearance. Maybe someone looks mean. That person is more likely than someone else to be caught and convicted for a similar crime. The algorithm then learns that mean-looking people have more "criminality." It uses that to catch and convict more mean-looking people. The cycle repeats. It's easy to see race and class biases becoming embedded and amplified.

Kosinski is hopeful, however, that AI can actually minimize inaccurate profiling. Even if absolute objectivity is an illusion, a computer might rely on relatively more objective signals than humans. He sees another possible benefit to automated profiling: increased tolerance. He would not out anyone without their consent, but imagines that if everyone were outed, homosexuality might become less taboo. "Do you really think," he asks, "that if people in Saudi Arabia realized that 7 percent of their neighbors, cousins, uncles, people in the royal family are gay, they would burn them all at the stake?"

Other traits appear in the face, too, whether through genetics, environment, or some combination. Nicholas Rule, the Toronto psychologist who studies social perception, recently co-authored a book chapter surveying the field. Based on faces, he found, people can predict personality, political orientation, Jewishness, Mormonism, and business and political success. Predicting professional success, though, is a bit like predicting criminal convictions: It should not be mistaken for predicting a truly inherent trait.

Humans who make these predictions score at or just above chance. At the most recent Psychology of Technology conference, Kosinski's graduate student Poruz Khambatta revealed that AI can do better. But it might never be great. Even if it is, it might not cause as much disruption as we fear, because we already have better ways to identify sexual preference, political ideology, and the rest: what people say, how they move, and what they wear.

"Past behavior is a better predictor of future behavior than how you look in the moment," says Alexander Todorov, a psychologist at Princeton University and the author of Face Value. If possible employers want to know if they're hiring a future terrorist or a bingo player, they're better off looking at your Facebook feed than your profile picture. Kosinski has done work showing that, based on Facebook likes, a computer can predict a wide variety of characteristics, including personality, intelligence, religion, and drug use. In fact, computers judged personality better than people's own spouses could. In many ways, they're getting to know us better than we know ourselves.

Aside from character and demographic traits, AI can also read genetic and developmental disorders in our faces. The majority of clinical geneticists now use an app called Face2Gene, which can evaluate the probabilities of 2,000 disorders. It helps to distinguish between different disorders when faces look somewhat abnormal and can suggest diagnoses even when a face shows no obvious signs to a physician's eye. Face2Gene has been trained mostly on white faces, so Maximilian Muenke, a geneticist at the National Human Genome Research Institute, is developing an app suitable for a variety of races, since many poorer countries don't have the resources to manually screen children. He notes that Nigeria has a population exceeding 180 million but not a single clinical geneticist. While such technology could be used to diagnose people against their will, "the benefits outweigh the possible negatives," he says.

As with all technology, there are tradeoffs. Our faces are rich with information, and we won't know what will happen when we harvest it all until we do. Judging from past advances—cars, televisions, the internet—many of our worries will turn out to be for nothing, while other, unforeseen, social dilemmas will surely crop up.

Our most public-facing body part is simultaneously our most intimate. We've evolved to share it with people in our close vicinity—and to have equal access to theirs. Someday soon that most basic social compact may be disrupted.

Matthew Hutson is a science and technology writer and the author of The 7 Laws of Magical Thinking.

Cracking Facial Recognition in Our Own Neural Networks

While some researchers try to engineer ever-better facial recognition technology, others are trying to reverse-engineer the facial recognition circuitry inside our own heads. These groups may soon be able to meet in the middle, advancing both neuroscience and computer science.

A part of the brain called the inferotemporal cortex, or IT, plays a key role in facial recognition, but its coding scheme has been a matter of debate for years. In recent research, Le Chang and Doris Tsao, neuroscientists at the California Institute of Technology, appear to have broken that code. They started by using a computer to generate 2,000 photo-realistic faces that differed from each other based on 50 variables, or independent dimensions. The researchers then recorded electrical activity in the IT cells of two monkeys—whose visual processing closely resembles our own—as the monkeys looked at the faces. They found that each cell was tuned to only one of the 50 dimensions. Chang and Tsao noted how remarkable it is that this area of the brain performs such an abstract calculation. Once they had the code, they could read a monkey's mind and reconstruct whatever face it was seeing.
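The "read a monkey's mind" step can be illustrated with a deliberately simplified linear model (a sketch under strong assumptions, not Chang and Tsao's actual analysis): if each cell's firing is proportional to the face's position along a single axis of the 50-dimensional face code, then the face parameters can be recovered from the population response by solving a linear system. All numbers here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
DIMS, CELLS = 50, 200   # 50 face dimensions, 200 simulated cells

# Each model cell is tuned to a single face dimension, with its own gain;
# 200 cells cover the 50 dimensions four times over.
tuned_dim = np.arange(CELLS) % DIMS
W = np.zeros((CELLS, DIMS))
W[np.arange(CELLS), tuned_dim] = rng.normal(size=CELLS)

face = rng.normal(size=DIMS)      # a face, coded as 50 numbers
responses = W @ face              # the "recorded" population activity

# Decoding: recover the 50 face parameters from the responses alone.
decoded, *_ = np.linalg.lstsq(W, responses, rcond=None)
print(np.allclose(decoded, face))  # → True
```

In this noise-free toy the recovery is exact; with realistic neural noise the same least-squares decode would give only an approximate reconstruction, which is the regime the real experiment worked in.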

The IT, however, represents only a face's shape and appearance, not its identity.
A recent paper by neuroscientists Sofia Landi and Winrich Freiwald of The
Rockefeller University explored this next step of processing. They recorded
cortical activity in four monkeys as they looked at photos of faces and objects.
The images of personally familiar faces activated two new areas of the brain—the
perirhinal cortex and the temporal pole, both of which are important for memory.
These areas also responded differently to familiar faces than did the others. As a
face gradually came into focus, instead of slowly becoming more active, the areas
suddenly jumped to attention in an aha! moment when the face revealed itself as
connected to everything the monkey knew about its owner.

These findings reveal how our brains make sense of the visual world, and the
more we learn about the brain's elusive codes, the better we can implement them
in silicon.
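A closing note for readers who want the AUC measure cited in the Typecasting section made concrete: AUC is the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one, which is why 0.5 is chance and 1.0 is perfect. A minimal sketch with invented scores (not the study's data):

```python
from itertools import product

def auc(pos_scores, neg_scores):
    # Probability a random positive outscores a random negative (ties count half).
    pairs = list(product(pos_scores, neg_scores))
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p, n in pairs)
    return wins / len(pairs)

# Invented classifier scores, for illustration only.
positives = [0.9, 0.8, 0.6, 0.4]   # cases the model should rank high
negatives = [0.7, 0.5, 0.3, 0.2]   # cases the model should rank low

print(auc(positives, negatives))  # → 0.8125
```

Swapping the two groups gives 1 minus the same value, so a ranker that is systematically backwards scores below 0.5.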
