
SPECIAL SECTION: ARTIFICIAL INTELLIGENCE



Ellie, a virtual health agent,
monitors patients’ expressions,
gestures, and voice.

The synthetic therapist


Some people prefer to bare their souls to computers
rather than to fellow humans
By John Bohannon

PHOTO: USC INSTITUTE FOR CREATIVE TECHNOLOGIES

People have always noticed Yrsa Sverrisdottir. First it was ballet, which she performed intensively while growing up in Iceland. Then it was science, which she excelled at and which brought her to the stage at conferences. And starting in 2010, when she moved to the University of Oxford in the United Kingdom to study the neurophysiology of the heart, it was her appearance. With her Nordic features framed by radiant blonde hair, “I just stand out here,” she says. “I can’t help it.”

After she arrived in the United Kingdom, she found that she no longer enjoyed the attention. She began to feel uncomfortable in crowds. Her relationships suffered. There had been some obvious stressors, such as the deaths of both of her parents. But the unease didn’t let up. By 2012, she says, “I felt like I was losing control.” Then she met Fjola Helgadottir, one of the few other Icelanders in town. Helgadottir, a clinical psychology researcher at Oxford, had created a computer program to help people identify and manage psychological problems on their own. Sverrisdottir decided to give it a try.

The program, based on a technique called cognitive behavioral therapy (CBT) and dubbed CBTpsych, begins with several days of interactive questioning. “It was exhausting,” Sverrisdottir says. The interrogation started easily enough with basic personal details, but then began to probe more deeply. Months of back and forth followed as the program forced her to examine her anxieties and to identify distressing thoughts. CBTpsych diagnosed her as having social anxiety, and the insight rang true. Deep down, Sverrisdottir realized, “I didn’t want people to see me.”

Then the program assumed the role of full-fledged therapist, guiding her through a regimen of real-world exercises for taking control. It sounds like a typical success story for clinical psychology. But no human psychologist was involved.

CBTpsych is far from the only computerized psychotherapy tool available, nor the most sophisticated. Ellie, a system built at the University of Southern California (USC) in Los Angeles, uses artificial intelligence (AI) and virtual reality to break down barriers between computers and humans. Originally funded by the U.S. military, its focus is on diagnosing and treating psychological trauma. Because patients interact with a digital system, the project is generating a rare trove of data about psychotherapy itself. The aim, says Albert “Skip” Rizzo, the USC


psychologist who leads the effort, is nothing short of “dragging clinical psychology kicking and screaming into the 21st century.”

A 19 June editorial in The New York Times deemed computerized psychotherapy “effective against an astonishing variety of disorders.” The penetration of the Internet into far-flung communities could also bring mental health treatment to vast numbers of people who otherwise have no access.

But whether clinical psychologists will accept AI into their practice is uncertain. Nor is it clear that the tools of AI can carry computerized psychotherapy beyond its so far limited capacity, says Selmer Bringsjord, a cognitive scientist and AI researcher at Rensselaer Polytechnic Institute in Troy, New York. “It is incredibly ambitious.”

ALL OF TODAY’S VIRTUAL psychologists trace their origins to ELIZA, a computer program created half a century ago. Named after the young woman in Pygmalion who rapidly acquires sophisticated language, ELIZA was nothing more than a few thousand lines of code written by Joseph Weizenbaum and other computer scientists at the Massachusetts Institute of Technology (MIT) in the early 1960s to study human-computer interaction.

ELIZA followed rules that determined how to respond during a dialogue. The most convincing results came from a rule set called DOCTOR that simulated a psychotherapist: By turning patients’ statements around as questions, the program coaxed them to do most of the talking. For instance, in response to a patient saying, “I feel helpless,” the computer might respond, “Why do you think you feel that way?” (You can talk to ELIZA yourself at http://psych.fullerton.edu/mbirnbaum/psych101/Eliza.htm.)

People engaged readily with ELIZA, perhaps more for its novelty than its conversational skills, but AI researchers were unimpressed. “The idea that you could make a convincing AI system that didn’t really have any intelligence was seen as cheating,” says Terry Winograd, a computer scientist at Stanford University in Palo Alto, California, who was a Ph.D. student down the hall from Weizenbaum. This was a wildly optimistic time for the field, with many researchers anticipating computers with human-level general intelligence right around the corner. But work on artificial general intelligence didn’t pan out, and funding and interest dried up in what has come to be known as the “AI winter.” It wasn’t until the turn of the new millennium that mainstream interest in AI resurged, driven by advances in “narrow AI,” focusing on specific problems such as voice recognition and machine vision.

Conversational “chatbots” such as ELIZA are still viewed as a parlor trick by most computer scientists (Science, 9 January, p. 116). But the chatbots are finding a new niche in clinical psychology. Their success may hinge on the very thing that AI researchers eschew: the ability of an unintelligent computer to trick people into believing that they are talking to an intelligent, empathetic person.

THAT ISN’T EASY, as Rizzo is keenly aware. What most often breaks the spell for a patient conversing with Ellie isn’t the content of the conversation, because the computer hews closely to a script that Rizzo’s team based on traditional clinical therapy sessions. “The problem is entrainment,” he says, referring to the way that humans subconsciously track and mirror each other’s emotions during a conversation.

For example, a patient might say to Ellie, “Today was not the best day,” but the voice recognition software misses the “not.” So Ellie smiles and exclaims, “That’s great!” For an AI system striving to bond with a human patient and earn trust, Rizzo says, “that’s a disaster.”

To improve entrainment, a camera tracks a patient’s psychological signals: facial expression, posture, hand movement, and voice dynamics. Ellie crunches those data in an attempt to gauge emotional state.

The patterns can be subtle, says Louis-Philippe Morency, a computer scientist at USC who has led the development of the AI that underlies Ellie. For instance, he says, a person’s voice may shift “from breathy to tense.” The team devised algorithms to match patterns to a likely emotional state. It’s imperfect, he says, but “our experiments showed strong correlation with [a patient’s] psychological distress level.”

Other patterns unfold over multiple sessions. For instance, the team’s work with U.S. veterans suffering from post-traumatic stress disorder (PTSD) revealed that “smile dynamics” are a strong predictor of depression. The pattern is so subtle that it took a computer to detect it: Smiling frequency remained the same in depressed patients, on average, but the duration and intensity of their smiles were reduced.

Even if Ellie were to achieve perfect entrainment, Rizzo says, it “is really just an enhanced ELIZA.” The AI under the hood can only sustain about a 20-minute conversation before the spell breaks, which limits the system’s usefulness for diagnosis and treatment of most psychological problems. Without sophisticated natural language processing and semantic knowledge, Ellie will never fool people into believing that they are talking to a human. But that’s okay, Rizzo says: Becoming too humanlike might backfire. One counterintuitive finding from Rizzo’s lab came from telling some patients that Ellie is a puppet controlled by a human while telling others she is fully autonomous. The patients told there was a puppeteer were less engaged and less willing to open up during therapy.

That’s no surprise to AI researchers like Winograd. “This goes right back to ELIZA,” he says. “If you don’t feel judged, you open up.”

Ethical and privacy issues may loom if AI therapy goes mainstream. Winograd worries that online services may not be forthcoming about whether there is a human in the loop. “There is a place for deceiving people for their own good, such as using placebos in medicine,” he says. But when it comes to AI psychology, “you have to make it clear to people that they are talking to a machine and not a human.”

If patients readily open up to a machine, will clinicians be needed at all? Rizzo is adamant that a human must always be involved because machines cannot genuinely empathize with patients. And Ellie, he points out, has a long way to go before being ready for prime time: The program does not yet have the ability to learn from individual patients. Rizzo envisions AI systems as a way to gather baseline data, providing psychologists with the equivalent of a standard battery of blood tests. “The goal isn’t to replace people,” he says, “but to create tools for human caregivers.”

Helgadottir has a bolder vision. Although computers are not going to replace therapists anytime soon, she says, “I do believe that in some circumstances computerized therapy can be successful with no human intervention … in many ways people are not well suited to be therapists.” A computer may be more probing and objective.

Sverrisdottir’s experience suggests that CBTpsych, at least, can make a difference. Under the program’s tutelage, she says, “very slowly, I started to analyze myself when I’m amongst other people.” She identified a pattern of “negative thoughts about people judging me.”

She might have got there with a human therapist, she says. But in the years since she first started talking to a computer about the trouble swirling in her mind, Sverrisdottir says, “I have been able to change it.” ■
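The DOCTOR rule set described above can be sketched in a few lines of modern code. This is an illustrative toy, not Weizenbaum’s original script: the patterns, pronoun reflections, and response templates here are invented for the example.

```python
import re

# Map first-person words to second-person so reflected fragments read naturally.
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you"}

# Each rule pairs a pattern with a response template; {0} receives the
# reflected capture group. The real DOCTOR script had far more rules,
# with keyword rankings to pick among them.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you think you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
]

def reflect(fragment: str) -> str:
    """Swap pronouns in the captured fragment ("my job" -> "your job")."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(statement: str) -> str:
    """Turn a patient's statement back into a question, ELIZA-style."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    # No rule matched: fall back to a content-free prompt that keeps
    # the patient doing most of the talking.
    return "Please tell me more."

print(respond("I feel helpless"))  # -> Why do you think you feel helpless?
```

The trick, then as now, is that the program needs no understanding of what “helpless” means; a handful of surface patterns is enough to keep a conversation going.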

SCIENCE sciencemag.org 17 JULY 2015 • VOL 349 ISSUE 6245 251


Published by AAAS
