Ep. 51: Axon still wants to put Taser drones in your kid’s school
[MUSIC]
[RADIO STATIC]
RADIO-1: We got a guy with a long rifle. We dunno where the hell he's at…
RADIO: We may have the suspect pinned down, northwest corner of the building. On the second
floor…
TEMPLE-RASTON: They were in a standoff with a sniper who had killed five officers.
RADIO: Those officers by squad car 2091, you’re facing the wrong direction. You need to get
some cover.
You know those robots that bomb techs use to defuse explosives?
But in this case, the robot wasn’t there to take a bomb apart.
[MUSIC]
[MUSIC BUMP]
TEMPLE-RASTON: This incident was incredibly controversial because it blurred the line
between policing and warfare.
SMITH: Gun control in the United States, as a political movement, has not made a
significant impact, and I wanted to take a different approach. What if I could make the bullet
obsolete?
TEMPLE-RASTON: This is Rick Smith. He’s the CEO of Axon, which provides technology to
police departments: Tasers, body cams…
SMITH VIDEO: This response could bring a situation under control in a matter of seconds.
TEMPLE-RASTON: Before Rick Smith’s announcement, Axon’s AI ethics board had spent
more than a year looking at whether putting tasers on drones was a good idea…
FRIEDMAN: And so the question was, is that a line that we just don't wanna cross?
[MUSIC]
TEMPLE-RASTON: I’m Dina Temple-Raston and this is Click Here, a podcast about all things
cyber and intelligence.
Today, we take you into the room with the Axon AI Ethics Board as they grapple with the idea
of putting weaponized drones on the streets with police.
TEMPLE-RASTON: Click Here obtained a report that provides, for the first time, an inside
look at the board’s deliberations.
And what it provides is a window into a debate swirling around AI-enabled devices.
Axon’s AI ethics board eventually decided that putting a taser on a drone was just too
dangerous, too ripe for abuse.
Just weeks after they said the project should be shelved, Axon’s CEO publicly announced
that the company had a plan to stop mass shootings by putting taser-equipped drones in
schools.
[BREAK]
TEMPLE-RASTON: AI ethics boards are kind of a thing now because there are so many
ethical risks associated with technologies powered by artificial intelligence. Drones use AI
software to perceive their surroundings, track objects and provide analytical feedback in
real time.
But the algorithms that power AI can be biased. The cameras they control can invade privacy.
And because these kinds of devices are built to operate at scale, any problems they have
affect lots of people — all at once.
[MUSIC]
BBC NEWS: So, could artificial intelligence play a vital part in providing some answers?
ACLU: But here’s the catch: these cameras pose a major threat to our privacy… that turns
our cameras from dumb recording devices to smart AI security guards watching us at every
moment.
CBN NEWS: In China the government uses AI to track and control the population…
TEMPLE-RASTON: All this is one of the reasons why Axon, one of the country’s leading
manufacturers of policing technology, decided to create an independent review board, to
advise the company on how to develop AI-powered products that didn’t trample on civil
liberties.
FRIEDMAN: They were operating in a space that was fraught and where they got criticized a
lot and thought it would be good to have an independent outside body that, uh, would
guide them.
TEMPLE-RASTON: This is Barry Friedman. He’s a law professor at New York University and
the director of its Policing Project.
FRIEDMAN: We were a little skeptical of it, frankly, but we, we agreed to listen. It's
interesting, a number of civil liberties and racial justice organizations declined to
participate. And I think over the period of time in which the ethics board was operating and
operating well, everybody felt that good work was being done.
TEMPLE-RASTON: Good work like convincing Axon not to put facial recognition software on
its police body cams, because the technology had too many issues: it misidentifies the faces
of women and people of color.
Or having the company modify plans to use high speed license plate scanners.
AXON VIDEO: We’re going to test the Fleet 3 system with license plate reading technology
against some of the fastest cars in the world. Ferrari!
TEMPLE-RASTON: The AI ethics board thought it could be used to illegally track people.
So when Axon’s chairman asked them to take a look at something they called Project ION,
an effort to put Tasers onto drones, the board thought this would be another episode in
which they could prevent a technology from getting ahead of itself.
FRIEDMAN: Weaponizing drones and robots has been a frontier, right? And so the question
was, is that a line that we just don't wanna cross?
FRIEDMAN: Drone has value as an eye in the sky. That itself is incredibly invasive, but to
then think that that's gonna be able to zap people that it sees, like, that's, that's disturbing
science fiction.
TEMPLE-RASTON: Which is I think why everybody's first reaction when they hear about
weaponizing drones is the one like, nah, this seems like a bridge too far.
FRIEDMAN: Sure. I mean it, you know, some level conjures up, you know, images from Star
Wars or something. It's not so good.
STAR WARS: You see, Lord Vader, she can be reasonable. Continue with the operation. You
may fire when ready. What?
[PLANET EXPLOSION]
TEMPLE-RASTON: And it might surprise you to know that the board didn’t dismiss the idea
of weaponized drones out of hand.
According to the report, it actually believed there could be some very limited cases in which
the police might find drones helpful.
Specifically situations like the one in Texas, with a highly experienced sniper targeting
police.
MAX ISAACS: You know, often I, I think that people working in technology can ignore, uh,
the social context in which these products are used.
TEMPLE-RASTON: That’s Max Isaacs. He did a lot of the research that went into the board’s assessment.
ISAACS: You can't just develop the technology with some safeguards. You need to look at
the types of policies that agencies are enacting, the way that they're enforced, the way that
officers are trained, all of the pieces kind of combined.
TEMPLE-RASTON: Which is why the board wasn’t just looking at adding wings to a taser.
It had to take into account some of the issues that dogged the tasers themselves.
ISAACS: We've heard some really disturbing reports about the abuse of tasers against
school children, elderly people, uh, even against people who, who have been restrained,
who are in handcuffs. Uh, and, and given that context, given the fact that the taser exists
today, and we haven’t found a way to prevent these abuses from happening, those concerns
are only magnified.
TEMPLE-RASTON: Back in 2017, when Dawson was just 14 years old, he was standing on a
hill near an apartment block with a friend when Cincinnati police officer Kevin Kroger pulled
up in his cruiser.
TEMPLE-RASTON: This is from the body cam footage taken that day.
Dawson is about 5’5”, 120 pounds, there are braces on his teeth.
And then he wriggles free and takes off running down the hill.
Kroger, who was looking for a suspect who had stolen a car, doesn’t tell Dawson to stop as
he’s supposed to do.
[TASER SOUND]
[TASER SOUND]
DIONDRE LEE: He says it was like everything seized up in his body, and he had no control.
LEE: All the momentum and everything was still going. So, when he hit the ground, he could
just watch himself hit the ground with no control whatsoever.
LEE: First time I watched the bodycam videos, it made me very, very, angry.
[MUSIC]
LEE: It made me very angry because it hurt my wife's feelings, and she gave birth to that
man, to that young man. So, when you hurt him, you hurt a part of her.
[MUSIC]
TEMPLE-RASTON: The Lees ended up filing a lawsuit against the police for excessive force.
They settled for an undisclosed sum, and the police never had to admit wrongdoing.
Dawson’s dad says he’s still not over it, all these years later.
LEE: Unfortunately, he, he has a real, he has a real quick trigger. His, he has a lot of triggers
within his anger and a lot of that has come post incident.
TEMPLE-RASTON: He’s 20 now, works in his dad’s business, and he still hasn’t put it behind
him. And neither has his dad.
LEE: I understand you gotta protect and serve and go home to your family, but you, but
every 14 year old African American that you run into is not used to doing them the way you
did my child. That was his first encounter and only encounter.
[MUSIC]
TEMPLE-RASTON: It's cases like these that concerned members of Axon's AI ethics board
when it began considering scenarios under which mounting a taser on a drone would be a
good idea.
FRIEDMAN: It gets used disproportionately on black and brown folks. That just always is the
case.
FRIEDMAN: The idea of a control panel with a taser drone felt a little bit too much like a
gaming platform.
TEMPLE-RASTON: And even if they had thought of every eventuality, how could they ensure
that the rules of engagement would be followed?
FRIEDMAN: 18,000 agencies in the country, as you might guess. There's a wide variety in
the quality of those agencies.
TEMPLE-RASTON: Which was one of the problems, the report said, they just couldn’t
overcome.
FRIEDMAN: There was a group of us that were just concerned that as well as we could
design this, and if designed well, as much as we believed it was something the world could
benefit from, we couldn't trust the overall variance in policing to make this a commercially
viable product.
TEMPLE-RASTON: So after a year of study, the ethics board decided to put it to a vote.
FRIEDMAN: And I think it was fraught in part because all of us understood the compelling
use case. And I wanna stress that we unanimously understood why you might want to be
able to use less lethal force from a distance to save lives.
And the police on the board, you know, told stories about horror shows they'd experienced
in their departments where somebody had been shot and their lives might have been saved
with something like this.
FRIEDMAN: We all kind of went around the room and said how we felt. And then we had
back and forth. And then after that, uh, we took a vote.
FRIEDMAN: We decided not to go ahead with the pilot, uh, and, and it, it was a tough
decision. I know Rick was very disappointed. This has really been one of his, his dreams. I
think.
TEMPLE-RASTON: When we come back, how the idea of tasers on drones went from general
police work to inside the schools.
FRIEDMAN: This one just went completely off the rails and I think for no reason whatsoever.
[BREAK]
TEMPLE-RASTON: The summer of 2022 was a terrible one for mass shootings.
[MONTAGE]
TEMPLE-RASTON: The Buffalo and Uvalde shootings happened just a couple of weeks after
Axon’s ethics board came to the conclusion that tasers and drones shouldn’t mix.
Which is why it seemed so odd when Axon’s CEO Rick Smith released this video.
SMITH: We’ve talked about these horrific school shootings. They just keep happening…
TEMPLE-RASTON: He announced that Axon would be developing a taser drone for schools.
SMITH: And if that human operator gives the go signal, then the drone rotors up, it
immediately deploys into the scene and there, together with the operator, it can help
identify the threat and under direct control of that human operator, it could incapacitate
that threat… I believe this is how we can end school shootings.
JORDAN-MCBRIDE: Absolutely not. Like that wasn't even… I don't think it ever crossed
anyone's mind as a potential case scenario.
TEMPLE-RASTON: That’s Mecole Jordan-McBride. She’s a community organizer in Chicago and she was on the Axon ethics board.
She was one of the four members of the board who voted to give that Taser drone pilot
program a try.
And she thought that Rick Smith’s school taser plan was ill-advised…
JORDAN-MCBRIDE: It just felt fanciful to me and it felt like an emotional response that
wasn't completely thought through.
TEMPLE-RASTON: And we should say here that we asked to interview Rick Smith for this
episode… and Axon declined.
Mecole said she tried to imagine how this would even work.
JORDAN-MCBRIDE: I was thinking like even in Chicago, like how many school buildings are
in Chicago, it's impossible for you to know which school would potentially be victim to this,
right.
JORDAN-MCBRIDE: And so now are we talking about literally putting a drone in every single
school across America? I thought about the, the, the amount of money that would be, you
know, I thought about the over surveillance of that.
TEMPLE-RASTON: To figure out where the shooter was, Axon would either need access to
any cameras that were there, or it would need someone to install them.
Though Axon’s video of how it would work made it seem a lot simpler than that.
SMITH: Working through partners, we're gonna activate any camera in any school, church,
or public building so that it can be easily shared with first responders. And then finally,
we're going to be building a taser drone system that can be remotely operated and
pre-positioned to stop threats in less than a minute.
TEMPLE-RASTON: But actually… think about every school shooting you’ve ever read about.
The gunman gets into the school armed to the teeth, wearing body armor, and barricades
himself in a classroom.
In Uvalde, police thought it was too dangerous to go in. They waited down the hall from
where the gunman was for more than 70 minutes.
FRIEDMAN: You know, drones have to get around and shooters go into rooms and close
doors.
FRIEDMAN: And the company's answer was, well, we'll just, you know, we'll cut holes in all
the doors so the drone could get through. I mean, you know, you're trying a little too hard.
Then, people that go in and commit these shootings have enough body armor on. It's not
like the drone’s gonna work.
TEMPLE-RASTON: For the board, taser drones in schools were a bridge too far.
FRIEDMAN: We could have talked as a board and we could have talked with the company,
but there was this great eagerness on Rick's part to, you know, get this idea out there in the
aftermath of Uvalde. And, um, it just, we couldn't operate that way.
TEMPLE-RASTON: Axon’s AI ethics board began a chain of phone calls and held an emergency meeting.
FRIEDMAN: Trust was so much of the work that we did. Uh, board members were often
attacked from the outside world for, you know, working with Axon, but we believed that we
were making progress and it was the right thing to do. And it's interesting because in the
aftermath of the collapse of the board, I got a number of emails from folks in different
places in the civil liberties and racial justice community saying it's a good thing that you did
the work and it's a good thing that you stopped when you did, given the circumstances.
TEMPLE-RASTON: Axon’s CEO responded by putting the whole taser drone project on hold.
[TASERCON MUSIC]
TEMPLE-RASTON: But the reason we’re talking about this now, the reason we have details
of the ethics board’s deliberations, is because Rick Smith is giving a keynote address at a
conference this week in Las Vegas, once again pushing the idea of introducing armed
drones into schools.
The company told Click Here in a statement that taser drones in schools are an idea, not a
product, and a long way off.
In the meantime, Rick Smith has said publicly that he’s engaging with teachers and school
boards and continuing to explore the idea.
JORDAN-MCBRIDE: I know that they have a new ethics board or, or something like the ethics
board. You know, my hope and my prayer is that, those individuals are asking the hard
questions, and that their design team, is really pushing back and, and trying to answer for
all of these what ifs.
TEMPLE-RASTON: The ethics board’s report, with all the what-ifs, is expected to be publicly
released this week.
[B SEGMENT MUSIC]
DINA TEMPLE-RASTON: There’s another piece of cutting edge technology that’s caught the
attention of school administrators recently.
You’ve probably heard of it, users type in what they want — and in seconds it churns out
human-like text on everything from Shakespeare to complex equations.
And a bunch of school districts don’t like it too much, since some students are using it to
make quick work of their homework.
But where the program is developing a very devoted following is in cyber criminal circles.
CRANE HASSOLD: My initial concern was, wow, this thing can really quickly generate some
really sophisticated email lures that could be used to exploit people.
TEMPLE-RASTON: This is Crane Hassold. He’s the director of threat intelligence at Abnormal
Security, a company that’s all about keeping emails safe.
Crane says ChatGPT is helping cyber criminals craft really good phishing emails.
You know, the ones that usually show up in your inbox with syntax and spelling errors.
No more! Just tell ChatGPT to write a persuasive email requesting a wire transfer and…
HASSOLD: So it starts out with Dear employee, I hope this email finds you well. I'm writing
to request a wire transfer to be paid to our supplier…
HASSOLD: And it goes on and on with multiple paragraphs of context and background…
TEMPLE-RASTON: As it's writing the paragraphs, is the cursor sort of going across the
screen just taking care of it?
HASSOLD: That's exactly what it's doing. It's literally typing it in front of my eyes.
TEMPLE-RASTON: But what it’s missing are those syntax problems and red flags that
scream – this is a scam. Watch out!
ALEXANDER LESLIE: It may not be the best, most technically, you know, um, sophisticated
form of malware. But… the tone of the posts among threat actors was incredibly optimistic.
They were very enthusiastic… because … the opportunities effectively are unlimited.
TEMPLE-RASTON: That’s Alexander Leslie of Recorded Future. And full disclosure: Click Here
is part of Recorded Future News, an editorially independent arm of Recorded Future.
Alexander says code produced by ChatGPT might not be ready to ship, but it makes the
business of cyber crime a whole lot easier.
LESLIE: What it's doing is it's lowering the technical barrier to entry for threat actors that
say don't have technical skills, they don't know how to program, they don't know where to
look, so they can go to ChatGPT and say, hey, can you build me a script for a program that
can hypothetically steal browser information or modify clipboard data.
TEMPLE-RASTON: A lot of this may still be in the proof-of-concept stage. But criminals see
ways to use the program to help steal credentials or drain cryptocurrency wallets. And in
response, OpenAI has set up a few guardrails. Direct requests to write malware might now
result in a ‘safety prompt’ that rejects the query. But with enough persistence, with enough
workarounds or creativity, it’s still possible.
[MUSIC]
LESLIE: If you know how to game ChatGPT, if you know the correct syntax, the correct tools,
um, you can get around a lot of the restrictions that ChatGPT has on its community
standards.
[HEADLINES MUSIC]
TEMPLE-RASTON: Here are some of the week’s top cyber and intelligence stories from The
Record:
Solaris, a large darknet marketplace focused on drugs and illegal substances, has been
taken over by a smaller competitor named Kraken, according to the blockchain monitoring
group Elliptic. Kraken claims to have hacked its rival in January, on Friday the 13th.
There’s evidence to suggest something happened: the Tor site of Solaris currently redirects
to Kraken, and Elliptic says there hasn’t been any activity in the crypto wallets associated
with the group.
The Solaris marketplace appeared on the scene pretty recently… not long after law
enforcement managed to shutter Hydra, a massive darknet marketplace we’ve talked about
in the past. Kraken shared logs that purportedly confirm it has taken full control of Solaris
and said that Solaris’ bitcoin wallets have been deactivated.
fact checker. Ben Levingston composes our theme, and we use other music from Blue Dot
Sessions. Gabriela Glueck is our intern.
And we want to hear from you. Please leave us a review and rating wherever you get your
podcasts, and connect with us by email: Click Here [at] Recorded Future [dot] com or on our
website at ClickHereshow [dot] com. I’m Dina Temple-Raston. We’ll be back on Tuesday.