
Ep. 15: At war with facial recognition: Clearview AI in Ukraine

(NPR Morning Edition theme)

NPR HOST: Iraq’s authorities have ordered life in Baghdad to come to a halt…

DINA TEMPLE-RASTON: When I was at NPR, one of my assignments was to cover the war in Iraq. I
was there in 2008.

NPR HOST: As NPR’s Dina Temple-Raston reports from Baghdad, there is rampant intimidation…

TEMPLE-RASTON: And back then if you wanted to get a pass to go into that little city within a city
behind the blast walls, Baghdad’s Green Zone…

ARCHIVAL TEMPLE-RASTON NPR: …the name is ironic given that it is monochromatic…

TEMPLE-RASTON: You had to give the U.S. Army your biometrics: Iris scans, fingerprints,
photographs. The whole nine yards.

And what I remember about it all is that I didn't have a choice. In order to do my job, in order to get
into that part of Baghdad and interview members of the interim government, I had to give up
information about myself.

JACKIE SINGH: Did they enroll you using a device, a handheld?

TEMPLE-RASTON: Yes.

JACKIE SINGH: Ah, yes. That was most likely…

TEMPLE-RASTON: Jackie Singh served in Iraq. She was there around the same time I was.

SINGH: When you are in a fog of war, being able to distinguish between friend or foe seems like the
most important thing you can do.

TEMPLE-RASTON: Who are those officials at the gate now? Is this the same local who was working
here yesterday?

At that time there were suicide bombings almost every day, so I can see that people needed to know
who was who.
SINGH: Having biometric data would help us make sense of that.

TEMPLE-RASTON: But I couldn’t help thinking, all this information they were gathering about me
would never go away. They would have some essence of me in some database, well, forever.

(THEME MUSIC)

TEMPLE-RASTON: Fast forward fourteen years and surveillance technology is on a whole different
level. Now there are systems that can take a single frame from surveillance camera footage,
isolate the face and, in very short order, tell us exactly who it is.

(MUSIC)

TEMPLE-RASTON: I’m Dina Temple-Raston, and this is Click Here, a podcast about all things cyber
and intelligence.

Today, we talk to people in Ukraine using a facial recognition program called Clearview AI, have a
rare interview with the CEO of the company, and ask a bigger question: Is introducing this powerful
technology into a war zone a good idea?

TEMPLE-RASTON: War zones are testing grounds for tech all the time, aren’t they?

SINGH: I agree.

TEMPLE-RASTON: So what makes this different?

Stay with us.

BREAK

TEMPLE-RASTON: Clearview AI makes a groundbreaking facial recognition app. It allows users to
take a picture of someone, upload it to its database, and match it to a publicly identified photo. In
just seconds, it puts a name to a face.

And now Clearview AI is in Ukraine.

HOAN TON-THAT: I would see photos of captured Russian soldiers, and I realized that that kind of
quality of photo, um, our facial recognition technology could be helpful.
TEMPLE-RASTON: Hoan Ton-That (pronounced “Juan Tawn-tat”) is the founder and chief executive officer of
Clearview AI, and he gave us a rare interview.

TON-THAT: So I reached out to a lot of people on our advisory board to ask them, and, you know,
investors, whoever…Do you know anyone in the Ukrainian government?

TEMPLE-RASTON: Turns out a member of their advisory board was leaving for Ukraine the next day,
and Ton-That gave him a letter to take with him. It wasn’t so much a pitch letter as a list of ways
Ton-That thought Clearview AI might be able to help the war effort. Things like identifying suspects
who were caught by surveillance cameras committing crimes or identifying spies at the border.

TON-THAT: We didn't hear back for a while. And then we got an email in our inbox with a few
questions about the facial recognition technology.

TEMPLE-RASTON: A Ukrainian defense official was interested, he said. And Clearview ended up
offering its subscription service to Ukrainian investigators — for free.

Which has been controversial because Clearview AI has been under fire for the way it built its
database in the first place: It has scraped some 20 billion images from social networks and other
online sources all over the world, and it did that without asking the websites or the people in the
photographs for consent. That’s why big tech companies like Google and Facebook have demanded
Clearview cease and desist, and why individual states — and even countries like Britain and Italy —
have filed lawsuits to make it stop.

Clearview argues the pictures are already public so the data collection is protected by the First
Amendment, which is an argument that might wash in the U.S. but doesn’t account for all the
photos the company has collected elsewhere.

And that’s just the beginning. Clearview hopes to have ingested 100 billion images within a year. If
you do the math, that’s 14 photographs for every person on earth. And now six agencies and some
400 Ukrainian investigators have log-ins for the service.

TON-THAT: We thought that it could be used to identify if someone is who they say they are. Uh,
along with potentially reuniting families or with refugee situations.

TEMPLE-RASTON: And Ukraine has found uses for Clearview that have surprised even Ton-That —
like identifying bodies on the battlefield.
TON-THAT: The deceased soldiers part of, you know, we thought it was possible. But to actually see it
in reality, uh, some of the photos were very gruesome, but it is something that changes the
dynamics of war.

(MUSIC)

TEMPLE-RASTON: Have you ever used facial recognition software before?

LEONID TYMCHENKO: Yes.

TEMPLE-RASTON: And is it fast?

TYMCHENKO: Yes, it’s just seconds. It depends on the quality of the picture…

TEMPLE-RASTON: Ukraine has a Department of National Police. And the head of their IT is a guy
named Leonid Tymchenko.

TYMCHENKO: …M-C-H-E-N-K-O. Tymchenko.

TEMPLE-RASTON: He says Clearview AI is helping his team in lots of ways. In addition to helping
them put names to Russians who might have been looting or killing civilians, he’s been using it to
identify the war dead — which is harder than it sounds.

While sometimes the dead have papers or dog tags or some kind of identification, in this conflict a
lot of them do not. Their uniforms are curiously without rank, or insignia, or name. Tymchenko says
Clearview has been helping them with that since March.

TYMCHENKO: We see very good results in recognizing even dead bodies, which is actually very good
because as you can imagine, speaking about dead soldiers it's not always when, when they have,
like, pretty face…

TEMPLE-RASTON: You mean that part of the face may be damaged somehow?

TYMCHENKO: Yes, exactly.

TEMPLE-RASTON: In a war that has been awash in misinformation, Tymchenko says Clearview is
helping ground the conflict in truth. In some cases, brutal truths.

TYMCHENKO: I have to ask you if you are, uh, if it is okay to show such kind of damaged faces,
maybe.
TEMPLE-RASTON: Yes, nobody will see it. This will just be us.

TYMCHENKO: What about you?

TEMPLE-RASTON: Uh, you can show me a damaged face. I‘ve been a war correspondent, so I’m OK.

TEMPLE-RASTON: On his screen was a photograph of a Russian soldier. He was dusty and twisted,
wedged in a gully as if killed in the act of ducking for cover. There was no obvious identification on
his uniform.

TYMCHENKO: We took a photo of his face from this photo and put it to the Clearview AI tool and
there were actually no direct results. But there were 19 additional results.

TEMPLE-RASTON: The app surfaced photographs it thought could be a match. Think of it as the AI
saying, Here are some options the algorithm isn’t sure about. I need a human to take a look. A human like
Tymchenko.
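
To make that concrete, here is a minimal sketch, in Python, of how a face-matching tool can separate confident hits from maybes that need a human reviewer. This is not Clearview’s code; the embedding size, the similarity thresholds, and the fake gallery are illustrative assumptions. The core idea: each face photo is reduced to a vector of numbers, the query vector is compared against every enrolled vector, and anything that scores close but not certain gets flagged for a person to check.

```python
# Minimal sketch of candidate matching for face search. NOT Clearview's
# implementation: the 512-number embeddings, thresholds, and gallery below are
# illustrative assumptions. It presumes some face model has already turned each
# photo into a fixed-length embedding vector.
import numpy as np

rng = np.random.default_rng(0)

def cosine_similarity(query: np.ndarray, gallery: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query vector and every row of the gallery."""
    q = query / np.linalg.norm(query)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return g @ q

def search(query, gallery, names, strong=0.80, possible=0.50, k=19):
    """Return (direct_hits, candidates_for_human_review), best scores first."""
    sims = cosine_similarity(query, gallery)
    order = np.argsort(sims)[::-1]                      # highest similarity first
    direct = [(names[i], float(sims[i])) for i in order if sims[i] >= strong]
    maybes = [(names[i], float(sims[i])) for i in order
              if possible <= sims[i] < strong][:k]
    return direct, maybes

if __name__ == "__main__":
    # Fake 512-dimensional embeddings for 1,000 enrolled photos.
    gallery = rng.normal(size=(1000, 512))
    names = [f"profile_{i}" for i in range(1000)]
    # A query that is a noisy copy of entry 42 -- similar but not identical,
    # the way a battlefield photo differs from an old social-media picture.
    query = gallery[42] + rng.normal(size=512)
    direct, maybes = search(query, gallery, names)
    print("direct results:", direct)
    print("candidates for a human to review:", maybes)
```

Run as written, the noisy query returns no confident hit but does surface the right profile as a maybe, which is roughly the situation Tymchenko describes: no direct result, just a short list of candidates for a human to judge.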

TYMCHENKO: So, uh, search another 19 results. I managed to identify the person with his account at
the social network of Odnoklassniki – it's like classmates translated into English – and search his
photos. I managed to get information that he is a military person. He is a soldier of a Russian army
and his name is Nikoli Kirkul.

TEMPLE-RASTON: He looks very different in the picture because he doesn't have a mustache.

TYMCHENKO: Uh, uh, if yes, there is no mustache. And maybe these pictures, they are from, let's say
two, three, maybe even more years. And, uh, so this is this person and you can compare if you will
take two pictures together…

TEMPLE-RASTON: Tymchenko showed me the pictures side by side.

TYMCHENKO: So you see. But, actually, it's the same person.

TEMPLE-RASTON: They did look like the same person. A fresh, clean-shaven young man in one
picture, the twisted figure in the other.

Having an accounting of these people is important for all kinds of reasons – from getting accurate
casualty figures to gathering evidence of war crimes. Tymchenko says his unit has identified about
5,000 people since March.
People like Russian soldiers who looted Ukrainian homes. A surveillance camera at the post office
caught them shipping washing machines and other big-ticket items back to Russia. Now, other
branches of Ukrainian law enforcement are clamoring for their own accounts.

TYMCHENKO: It's also cyber police. It's also the department of crime, criminal analysis and
department of criminal service. Our investigators, they also use this technology. They have accounts
flowing from Clearview, and we hope we will be able to also provide free accounts to our metro
police.

TEMPLE-RASTON: And then there’s this thing I never thought about until he mentioned it. While
most facial recognition systems rely on recognizing people with their eyes open, Clearview can
identify people even if their eyes are closed.

TYMCHENKO: For example, it's impossible to unlock a cell phone if your eyes are closed. Clearview is
able to identify people [with] real damaged faces, people who have their eyes closed.

TEMPLE-RASTON: Which just gives you an idea of how far facial recognition technology has come.

(MUSIC)

TEMPLE-RASTON: Just about a decade ago top-of-the-line facial recognition software's scores for
accuracy were like 70 percent. And it freaked us out.

NEWS: Science fiction almost got it right. Facial recognition technology has become part of daily life.

LOCAL FOX AFFILIATE: Here’s the deal: the feature will use a facial recognition program, which will
scan pictures and videos posted to Facebook…

NEWS: To identify the people in that crazy picture from last weekend’s beach party. It’s all
happening, of course, on Facebook…

TEMPLE-RASTON: Facebook was an early adopter with photo tagging back in 2011, but the system
was rudimentary. Essentially: Which one of these people is your friend? Police started using it to sort
through mugshots a year or two later, but accuracy and racial bias were huge problems.

Then in 2018 and 2019, artificial intelligence and something called neural networks just changed
everything. The easiest way to understand neural networks is to think of them as a series of
algorithms that recognize underlying relationships between things — kind of like the human brain
does.
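
For anyone curious what “a series of algorithms that recognize underlying relationships” looks like in code, here is a toy sketch, again in Python and again built on illustrative assumptions. A couple of stacked layers turn raw pixel values into a compact vector, an “embedding,” and photos that look alike end up with vectors pointing in similar directions. The weights below are random and untrained, which only shows the mechanics; a real face model learns its weights from millions of photos and is far more robust.

```python
# Toy, untrained stand-in for a face-embedding network: two stacked layers map
# a 32x32 grayscale image to a 256-number embedding. Layer sizes and random
# weights are illustrative assumptions, not any real model.
import numpy as np

rng = np.random.default_rng(1)

W1 = rng.normal(scale=0.05, size=(32 * 32, 512))   # first layer weights
W2 = rng.normal(scale=0.05, size=(512, 256))       # second layer weights

def relu(x: np.ndarray) -> np.ndarray:
    """Keep positive activations, zero out the rest (a standard nonlinearity)."""
    return np.maximum(x, 0)

def embed(image: np.ndarray) -> np.ndarray:
    """Flatten an image, push it through two layers, return a unit-length embedding."""
    x = image.reshape(-1) - image.mean()   # crude brightness normalization
    h = relu(x @ W1)
    e = h @ W2
    return e / np.linalg.norm(e)           # unit length: similarity is a dot product

if __name__ == "__main__":
    face = rng.random((32, 32))
    same_face_new_photo = face + rng.normal(scale=0.1, size=face.shape)  # same face, noisier shot
    different_face = rng.random((32, 32))
    print("same face, different photo:", round(float(embed(face) @ embed(same_face_new_photo)), 2))
    print("different face:            ", round(float(embed(face) @ embed(different_face)), 2))
```

Even with random weights, the same face photographed twice scores far higher than two different faces, and that is the relationship a trained network sharpens until, as Ton-That says, it can pick one photo out of millions.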
TON-THAT: I’d say now in 2022, all the top-performing algorithms ranked by NIST have almost no
demographic bias and are much more accurate than the human eye. They can pick, you know, a
photo out of 12 million. That’s what NIST is testing.

TEMPLE-RASTON: NIST, that’s the National Institute of Standards and Technology. It’s part of the
Department of Commerce and evaluates various technologies, provides an independent assessment
and then sets standards. And NIST seems to be saying that facial recognition isn’t the stuff of
science fiction anymore. It’s arrived.

(MUSIC)

When we come back: what happens when new technologies are misused…

SINGH: It comes down to intentions being divorced from second and third order effects, right?
You can collect data, you can label people with the best of intentions. But….

TEMPLE-RASTON: Stay with us.

BREAK

TEMPLE-RASTON: So let's, let's start with an easy one. Could you introduce yourself to us please?

SINGH: Oh, wow. Okay. Yeah. Okay.

TEMPLE-RASTON: That was supposed to be an easy one…

SINGH: (Laughs) No, let me start over. And so what I did… (fade under)

TEMPLE-RASTON: Jackie Singh is an Iraq War veteran who joined the military when she was just 17.

SINGH: So I signed my name on the dotted line and because I was 17, my mom had to sign for me as
well.

TEMPLE-RASTON: She was part of the early 2000s underground hacking scene, hanging out with
people from a hacker collective you may have heard of: 2600.
SINGH: They put out a magazine once a quarter and we would have meetings once a month where
kids and adults from different walks of life would meet and chat about all the latest ways to maybe
get some free items from a vending machine or get free calls from a payphone.

TEMPLE-RASTON: Remember gray-hat hacker Adrian Lamo? He was one of 2600’s early leaders.

SINGH: I have no formal education. Um, I dropped out of high school the day after I turned 16. I was
spending a lot of time in the hacker halfway house in Brooklyn. (laughs)

TEMPLE-RASTON: Then, when she was in Iraq — first as a member of the military and then later as a
military contractor — she saw military surveillance firsthand.

SINGH: Well, uh, surveillance operations in a war zone are ubiquitous. We're talking about drones,
aerostat blimps, which were these very large suspended blimps up in the air that were tethered to
the ground. These were essentially seen by me as very positive things.

TEMPLE-RASTON: When you were in Iraq did you feel this technology was being used in the right
way?

SINGH: I never had any reservations.

TEMPLE-RASTON: These days, Jackie Singh has come full circle. Now she works for a nonprofit
advocating against facial recognition and mass surveillance. And she says data collection has just
gone too far.

SINGH: I have never been involved in a data collection effort where the conversation was anything
other than how do we get more data. It's never about the ethics of it. It's never about, should we? We
can do these things, but should we? And the answer is absolutely always yes.

TEMPLE-RASTON: And even if the people collecting all this data are doing it for all the right
reasons, sooner or later the information falls into the hands of someone who uses it in a way that
wasn’t intended.

And this isn’t hypothetical. It has already happened in Ukraine.

(VIDEO FROM UKRAINIAN IT ARMY)

TEMPLE-RASTON: So that’s a crazy video from the Ukrainian IT Army, a quasi-government-backed
group of international hackers. It feels like something straight out of Mr. Robot, only Ukrainian.

(VIDEO FROM UKRAINIAN IT ARMY)


TEMPLE-RASTON: They’ve leaked Russian documents and cracked into Russian television — all in a
bid to try to reach the Russian public and tell them the truth about the war. And that video? It looks
like Clearview AI is in it.

It shows someone loading a photo of a dead Russian soldier into what looks like the Clearview
program, which very quickly identifies him.

(VIDEO FROM UKRAINIAN IT ARMY)

TEMPLE-RASTON: The voice in the video is unveiling a new IT Army initiative: They decided to call
Russian families directly to tell them their kids had died in the war and then send them
photographic proof. Not in a nice way but in a demoralizing way.

Tymchenko said it wasn’t his guys, and he doesn’t have an explanation for how this could have
happened.

TYMCHENKO: Such volunteers, they managed to identify cell phones and personal numbers and
phone numbers of their relatives, and they called them.

TEMPLE-RASTON: They send pictures of the dead to families or post them on Telegram.

TYMCHENKO: They like admitted they are trying to pretend that they are Russian officers and they
provide information that, you know, your son was dead. I don't know if it's really correct. It's not my
part of job and it's not something which I would like to do, but maybe it's another possibility to
provide a real picture of what is going on here in Ukraine. But we don’t use Clearview AI to make
such things.

TEMPLE-RASTON: We reached out numerous times to the IT Army asking them to explain how they
got their hands on what looks like a Clearview search. We didn’t get a response.

TON-THAT: Look, if I thought it would be used in a really bad way, then I don’t think that, you know,
[we would have given] access to them.

TEMPLE-RASTON: When we asked Ton-That, he said that “the speculation that the IT Army is running
a Clearview AI search does not match any information we have.” We asked him later if he thought it
might be a spoofed search and he didn’t respond.

Whatever happened — whether Clearview or someone else explicitly gave access to the IT Army — is
a little beside the point. An unauthorized person appears to be using the database for an
unintended purpose. Ton-That said a lot of this was beyond his control.
TON-THAT: What we can control as Clearview is, you know, giving access to the right people. A lot of
these agencies and investigators have their own, you know, procedures, and so I can't speak to
exactly how they deploy everything and all that stuff, but they assured me that they want to make
sure that it's done in the right and humane way.

TEMPLE-RASTON: The problem is that assurances don’t really amount to much. Procedures don’t
either. Critics say facial recognition is an unregulated space and if you put something like that in a
war zone, bad things are bound to happen.

After all, it’s war.

TEMPLE-RASTON: I mean, it must be tremendously hard to think about every way someone could
take something that you meant for good and turn it on its head for bad…

TON-THAT: It’s something we think about all the time, right? But when we started Clearview, we
were really thinking of the use cases first. And I think that's our philosophy, is you can always make
technology, but let's focus on the best use case and the highest purpose of something.

TEMPLE-RASTON: Which may be a little naive, because it isn’t just the all-volunteer IT Army that is
reaching out to Russian families. Last week, the head of Ukraine’s Ministry of Digital Transformation
said that his team was sending photos to families too, partly to tell them the truth about the war
and partly to tell them that if they wanted to pick up the bodies, they could come to Ukraine.

Tymchenko says he doesn’t know of any police investigators abusing their Clearview access. But he
said he wasn’t really looking for it.

TYMCHENKO: To tell the truth, I didn't have such kind of a goal, maybe to check as an administrator.
I don't know. I think maybe there is a possibility to use it in some kind of incorrect way, but so far we
are focusing only on war crimes and we didn't spend much time checking what our guys are doing.

TEMPLE-RASTON: Tymchenko is focused on his job, and it's hard to blame him. The Ukrainian
prosecutor general has logged more than 10,700 war crimes since the war began.

(MUSIC)

TEMPLE-RASTON: Ukraine marks the first major conflict in which we’ve seen facial recognition
technology deployed at such scale. And it arrived there without a procurement process and without
any real oversight.
Jackie Singh says the fact that Clearview has provided this for free masks what the company has to
gain: This could further normalize facial recognition because they are making it ubiquitous.

TON-THAT: And we just crossed the 20,000 search mark. So they're actively using it. And it's about
two months now.

TEMPLE-RASTON: Singh, for her part, worries about the things we haven’t had the space to imagine
yet.

SINGH: When you're providing a technology to a battlefield in the middle of a war zone, you're
introducing additional instability and insecurity to a country that hasn't had this technology that
doesn't have a real process or procedure for using it.

TEMPLE-RASTON: She says we should see Clearview in Ukraine for what it is…

SINGH: We're looking at a defense contractor that is essentially widening its sphere of influence.

(MUSIC)

TEMPLE-RASTON: Last week, Clearview AI agreed to permanently stop selling access to its database
to private businesses or individuals around the U.S. It was part of a settlement agreement with the
ACLU and the State of Illinois. The settlement still needs to be approved by a judge, but it will limit
what Clearview can do with the ever-growing trove of images.

Clearview is also agreeing to stop making the database available to the Illinois state government and
local police departments for five years. Ton-That told us it could be a template for other state suits
the company is facing, and he seems sanguine about it all. He says this is how all technologies
mature.

TON-THAT: So like any new technology, I say like the automobile, for example, when they invented
the car, there were no lanes, there were no traffic lights, and seatbelts and those things all came
over time. And that's the life cycle of technology: Something is invented, people have to figure out
what it’s good for, what it’s bad for, and then a period of adoption where people start using it and then
a period of regulation.

TEMPLE-RASTON: But unlike, say, a car, this technology sweeps up intimate details from
all of us without our knowing. It could be my iris scans from Iraq, your wedding pictures in Paris or
the snapshots from the company picnic posted on Facebook.

Leonid Tymchenko says a lot of his officers want their own Clearview logins.
TYMCHENKO: And we have so, so, so many requests for this.

TEMPLE-RASTON: And he’s already asked for more.

This is Click Here.

(HEADLINES MUSIC)

TEMPLE-RASTON: Here are the big cyber and intelligence stories from the past week.

The US and EU say that the cyberattack that took down satellite communications in Ukraine hours
before the Feb. 24 invasion was the work of the Russian government. While government officials
blamed Moscow for the hack, they stopped short of saying publicly which arm of the Russian
government was behind it. American officials, speaking on condition of anonymity, said that it was
the Russian military intelligence, the G.R.U. — the same group responsible for the 2016 hack of the
Democratic National Committee.

The REvil ransomware group may be back. In January, Russia’s Federal Security Service rounded up
more than a dozen of the gang’s members and said it was cracking down on ransomware actors.
Cyber threat analysts thought it would be the end of REvil at least in its current form, but three
weeks ago researchers discovered REvil ransomware’s servers were back up and running. The
group’s blog is also back. Stay tuned.

And finally, while the commencement at Lincoln College seemed like any other this spring…

DAVID GERLACH: As today’s commencement ceremony is a traditional academic ritual…

TEMPLE-RASTON: This year the graduation at Lincoln, which served a largely Black and Latino
student body, was tinged with sadness. The 157-year-old school in Central Illinois is closing its doors
forever in the wake of a crippling ransomware attack during the winter break. School officials said
the breach didn’t cause the school to close, but it dealt a death blow to an institution that was
already struggling with dwindling enrollment amid a pandemic. The school’s small endowment
couldn’t make up for the losses. Iranian hackers are thought to have been behind the ransomware
attack.

Today’s episode was produced by Will Jarvis and Sean Powers, and it was edited by Lu Olkowski, with
fact-checking from Darren Ankrom. Ben Levingston composed our theme and original music for the
episode. We had additional music from Blue Dot Sessions.

Click Here is a production of The Record Media.


And we want to hear from you. Please leave us a review and rating wherever you get your podcasts.
And you can connect with us at ClickHereShow.com. I’m Dina Temple-Raston. We’ll be back on
Tuesday.
