
• Hi, I'm Malika Bilal and you're in The Stream. What's the danger of facial recognition technology? Civil liberties groups say safety shouldn't trump privacy protections, but government and police argue the tech ensures safety. What do you
• 0:30 think? I'm a travel Dean. Today we'll look at how major cities are taking very different approaches to facial recognition, and explore how the Chinese government is using this technology to track citizens. But first, how does facial recognition even work? Check out this
• 0:45 clip from a documentary I filed for AJ+. Facial recognition software creates data points that compare facial features. Databases are built from driver's licenses, mug shots and surveillance video. The initial technology was
• 1:00 developed in the 1960s. Today it's replacing fingerprinting and police lineups. Facial recognition databases used by law enforcement in the U.S.
• 1:15 include 117 million adults. That's one in three Americans. San Francisco, California,
• 1:30 is widely known as a tech hub, but last week city officials voted to ban the use of facial recognition by police and other government agencies. Proponents of the tech say it could help prevent or solve crimes, but critics contend these systems are prone to error, pointing to cases in which they appeared biased against women and people of color. If adopted widely, they argue, facial recognition could be abused to target rights activists or
• 1:45 marginalized communities. Here with us to unpack the debate: Matt Cagle serves as a technology and civil liberties attorney at the American Civil Liberties Union of Northern California. In London, Silkie Carlo is the director of Big Brother Watch, an organization tackling issues
• 2:00 related to privacy and technology. And last but not least, in New York City, Lily Hay Newman is a security reporter for Wired; she focuses on information security, digital privacy and hacking. Welcome to The Stream, everyone. Silkie, I
• 2:15 want to start with you. We heard a little bit there in the intro, from that clip I played, about the basics of facial recognition technology, but what is it, and why does it seem so ubiquitous today? Facial recognition enables people to be
• 2:30 identified in real time by surveillance cameras, most of which look like ordinary surveillance cameras, and it completely changes the nature of policing. It enables the police,
• 2:45 potentially, to identify and track thousands of people at any time, which puts an awful lot of power in state hands and, I think, really rebalances the relationship between the citizen and the state. You know, you said in your piece that it's replacing fingerprinting. With
• 3:00 fingerprinting, a police officer has to ask you for your fingerprint; you have to have an interaction; they have to have probable cause, some kind of purpose. With facial recognition, we're all being subjected to a kind of perpetual police lineup, and we're all being treated as suspects. So Lily, help us
• 3:15 break this down just a little bit further. That was an excellent definition from Silkie there, but just for our international audience: when we're talking about facial recognition, what for you makes this even more chilling, perhaps, than CCTV? And what's the
• 3:30 difference between facial recognition and using some of the apps on our phone, either to unlock our phone or to use facial recognition features on some of
• 3:45 the apps we know and love? Yeah, I think the pervasiveness, as Silkie says, comes from the fact that our face is just sort of out there in the world, and we're not necessarily choosing what sees it or what it interacts with. You know,
• 4:00 when we use it in an app, or to authenticate something that we intend, or to go into a building that we want to go into, it's an interaction where we're making a choice. But when we're being passively viewed through cameras that
• 4:15 are placed in public, or in semi-public or private spaces that we're just moving through in our daily lives, it becomes kind of out of control: we're just walking along and our face is leading in front of us in
• 4:30 this new way. So Matt, what concerns did San Francisco lawmakers specifically have when it came to facial recognition? They're concerned about a lot of what's already been said. Facial recognition, face surveillance, provides the government with an unprecedented
• 4:45 power to track individuals and groups while hardly lifting a finger in terms of human effort. You can imagine a world where you step outside your door and the government instantly knows who you are, where you are, who
• 5:00 you're associating with, and even the expressions on your face, whether it's correct or it's just a guess. This is a dangerous technology, and San Francisco, at the heart of innovation here, recognized that this technology has
• 5:15 dangers, known dangers, right now. If it's inaccurate it's dangerous, and if it's perfectly accurate it's dangerous to our democratic rights. Matt, so they stepped up and they acted. Sorry, I didn't mean to interrupt you, I thought you were done. But as you were talking, I couldn't help but wonder, as you were outlining all those things: you know, my iPhone and
• 5:30 Apple know all those things, even just without the, you know, Face ID technology
that exists. But it's not just lawmakers in San Francisco, of course, who are concerned. Sarah Rasheed tweeted, saying: I am for the ban. Not only is it an invasive way to profile people, but there is no
• 5:45 scientific evidence that facial recognition has helped improve safety or deter criminal activity. I would love to hear from you, Silkie, after we watch this video from Brian Hofer, which he sent to us. He is the man who drafted the law that banned San Francisco agencies, of course,
• 6:00 from using this technology. Take a listen to what he said. I think we all intuitively understand the dangers of this technology, that right now, today, it would be really reckless to use it because of its really high error rate.
• 6:15 But the bigger concern is actually long-term: that it's going to become perfect surveillance, that we will not be able to move about society freely. I think it would obliterate our First Amendment protections, like the freedom
• 6:30 of speech, religion, assembly and association. I can't detach myself from my face. I can't leave my house without my face, and everywhere I go, I'll be tracked. Silkie, forgive me, your name
• 6:45 sounds so similar to Lily's. I meant to ask Lily. Lily, I'm curious, your thoughts? Yeah, I mean, I think, as you said, there's just a really cumulative effect of this, and I also really appreciated Matt's point that, you know, whether this technology
• 7:00 succeeds in becoming highly accurate, which it currently is not, or whether it stays in the state that it's in, it's really dangerous either way, because you're either potentially misidentifying people, or having, you know,
• 7:15 people get involved who are totally unrelated to something, or the situation that we were hearing about in that clip, where you're constantly, correctly, being tracked. So either way there's just this cascade of serious
• 7:30 concerns. So I want to posit something for all of you, and for our audience, on the flip side of this, and get your thoughts on it. We reached out to a local sheriff's office here in the U.S. whose department is using
• 7:45 facial recognition in the cloud. You can take a look at this website here: facial recognition can be used to identify suspects quickly. Now, this is the Washington County Sheriff's Office, and they actually sent us a statement on what they're using it for and how it's gone for them. They say: we can report
• 8:00 that facial recognition has been a successful tool for us, and it has several success stories. One thing to understand is that the facial recognition software we use is not the deciding factor when identifying individuals. If we input a picture of an individual and
• 8:15 there's a result, an investigator takes the information as a lead. This is a human-based decision, not a computer-based decision. So Matt, when you hear this, and you hear the explanation for
• 8:30 why it might be necessary, what do you make of it? Well, for starters, we know from recent reporting by the Georgetown privacy center that many police departments are actually misusing face surveillance and facial recognition systems, and that these systems don't
• 8:45 make us safer. To build even a straightforward-seeming face surveillance system, you need to get hundreds of thousands of photos of innocent individuals who never consented to be part of a face surveillance database. And really, right now, as other guests have said, it's inaccurate
• 9:00 technology; it's technology biased against people of color, and particularly women of color. Many of the quote-unquote public safety benefits are theoretical, but what we do know is that it's biased, it's inaccurate, and departments aren't being transparent about how they're using it.
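The "perpetual police lineup" the guests describe has a precise technical meaning: in a one-to-many search, every enrolled face is scored against every probe image, so everyone in the database is effectively in every lineup. A minimal sketch, assuming faces have already been reduced to fixed-length embedding vectors; the 128-dimensional embeddings, the 0.6 distance threshold and the person names are illustrative assumptions, not any real system's parameters:

```python
import numpy as np

def one_to_many_search(probe, gallery, names, threshold=0.6):
    """Compare one probe embedding against every enrolled embedding.

    Every person in the gallery is scored on every search -- the
    'perpetual lineup' critics of face surveillance describe.
    """
    # Euclidean distance from the probe to each enrolled face.
    distances = np.linalg.norm(gallery - probe, axis=1)
    # Anyone closer than the threshold is reported as a candidate match.
    hits = [(names[i], float(d)) for i, d in enumerate(distances) if d < threshold]
    return sorted(hits, key=lambda h: h[1])

rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))                   # 1,000 enrolled faces
names = [f"person_{i}" for i in range(1000)]
probe = gallery[42] + rng.normal(scale=0.01, size=128)   # noisy shot of person_42
print(one_to_many_search(probe, gallery, names))
```

Note that nothing in the search itself distinguishes suspects from bystanders: whoever is enrolled gets compared, which is exactly the consent problem raised above.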
• 9:15 There was a report recently in Gizmodo that actually debunked some of Washington County's claims and found that they're not even following Amazon's own guidance. And it's very paltry, weak guidance; they're not even following that
• 9:30 weak guidance in using the system. That doesn't make anyone more safe. So I want to push on just a little bit here in this conversation and move on to the United Kingdom, because police departments there have been conducting street trials of facial recognition
• 9:45 cameras. Earlier this year, Metropolitan Police in London fined a man 90 British pounds, about a hundred and fourteen dollars, when he protested having his picture taken. So, Silkie, talk to us about this case, because you actually were
• 10:00 there. You saw it with your own eyes. Yeah, I did. It was really shocking, and I think it really speaks to this new power imbalance that occurs when police have facial recognition. A man came out of
• 10:15 the train station and saw a group of us standing with placards and leaflets, letting people know that the surveillance cameras in the area were actually facial recognition cameras. In a very, very small act of resistance, he merely pulled the bottom of his jumper up over his
• 10:30 chin. And we had been tailed by plainclothes police officers, who were watching how people were responding to us informing them of the facial recognition cameras. So very quickly he was swooped on by a team of police officers;
• 10:45 they demanded to know why he had dared to cover his face, they demanded his ID, they really riled him, and he was a bit aggravated, and they gave him a fine. And this has sent chills across British
• 11:00 society, actually. The clip of what happened has been viewed millions of times online, and I think people are now starting to wake up to what's happening in the UK with this technology, and have become outraged about it. This so-called trial by
• 11:15 the police has been going on for four years now, and we've been campaigning, and we're still campaigning, for it to come to an end, because it's incredibly undemocratic; it's incredibly un-British to see a mass
• 11:30 surveillance tool like this eroding people's civil liberties and really changing the nature of society and freedom in the UK. You know, Matt, it's interesting: we've outlined some of the dangers, and a lot of people who are watching live on YouTube
right now
• 11:45 are agreeing with you, and also pointing to other things. MRA, for example, says: I think the technology on its own isn't the dangerous thing; it's the intent of the people who are using it. It could vastly improve our lives or turn our countries into police states. And I
• 12:00 just want to scroll down a little bit in this YouTube chat; he also goes on to say: at the same time, the riches of Silicon Valley who are developing all these technologies are forbidding their own kids from using them, so I think this is a sign that these technologies should be
• 12:15 regulated. Your thoughts on that? Yeah, so, two points. First, once these systems are built and deployed, we won't be able to rein the harm in, and that's exactly, unfortunately, what we've started to see in places like China. The
• 12:30 history of surveillance in the United States and in other nations is a history of surveillance technologies being turned against people of color, against activists and against immigrants. We can fully expect governments to do the same with face surveillance technology, and
• 12:45 that's exactly why a coalition formed in San Francisco, not just of people who understand technology but of 25 different organizations, ranging from orgs that represent immigrants' rights to racial justice to the homeless to even criminal
• 13:00 defendants. A diverse coalition is what came together here and said: all of our lives depend on the freedom to walk down the street safely without being tracked; all of our lives depend on the freedom to not be logged into a government database because we're advocating for
• 13:15 our own rights in this democratic society. And so, while those leaders recognized that here, in the heart of technology, they needed to put safeguards in place for dangerous new technologies, what really drove this was the community, a diverse community, and that sort of movement, I think, is a
• 13:30 really important point: it is possible everywhere; it is not just something that can happen here in San Francisco. And we're already seeing the domino effect; places here in California, but also across the United States, are
• 13:45 considering similar bans. Lily? And I think the idea, in terms of the potential utility for policing, is that perhaps there could be a way, if the technology were accurate enough, and if there were enough insight into how these algorithms work and how
• 14:00 decisions are being made, in a really granular way at each stage, that it might be possible to set limits on how you can query one of these databases or use one of these services, so that police are able to get their
• 14:15 match, or get the thing that they need, without the cascading effects. But because, as Matt said, once the systems are set up they are there, and they're persistent, it's difficult to know how to set those parameters. And I think that's
• 14:30 why privacy advocates are calling for this sort of pause, you know, particularly within the US this week, but in general: because there needs to be some time for society, as a global
• 14:45 community, to discuss how these restrictions might be possible, if they're possible.
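The kind of query limits Lily describes could, in principle, be enforced in software around the search itself. A purely hypothetical sketch: the requirement of a case number, the audit log, and the hourly limit are my illustrative assumptions, not any agency's actual policy or system:

```python
import time
from collections import deque

class AuditedFaceSearch:
    """Hypothetical policy wrapper: every query needs a documented case
    number, is written to an audit log, and is rate-limited per officer."""

    def __init__(self, search_fn, max_queries_per_hour=10):
        self.search_fn = search_fn        # the underlying face-search function
        self.max_queries = max_queries_per_hour
        self.audit_log = []               # who searched, when, and under what case
        self.recent = {}                  # officer id -> recent query timestamps

    def query(self, officer_id, case_number, probe):
        if not case_number:
            raise PermissionError("a documented case number is required")
        window = self.recent.setdefault(officer_id, deque())
        now = time.time()
        while window and now - window[0] > 3600:
            window.popleft()              # drop queries older than an hour
        if len(window) >= self.max_queries:
            raise PermissionError("hourly query limit reached")
        window.append(now)
        self.audit_log.append((now, officer_id, case_number))
        return self.search_fn(probe)
```

Whether such safeguards are enough is exactly what the guests dispute: an audit log only restrains misuse if someone independent actually reviews it.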
So, you mentioned something that happened this week, and I want to let our audience in on it. On Wednesday, the House Oversight and Reform Committee in the U.S. held a first hearing on facial recognition technology, to examine
• 15:00 the impact on civil rights and civil liberties. I want you to have a listen to the founder of the organization Algorithmic Justice League; she's speaking to Congress about these systems, and this clip in particular was caught on C-SPAN here in the US. She's speaking to Representative Alexandria
• 15:15 Ocasio-Cortez. Take a look at this tweet; they've written out the exchange here because it's so interesting. AOC starts with: are algorithms most effective on women? So I'm going to scroll down and have you listen to a little
• 15:30 part of that. Well, then we need... I heard your opening statement, and we saw that these algorithms are effective to different degrees. So are they most effective on women? No. Are they most
• 15:45 effective on people of color? Absolutely not. Are they most effective on people of different gender expressions? No, in fact they exclude them. So what demographic is it mostly effective on? White men. And who are the
• 16:00 primary engineers and designers of these algorithms? Definitely white men. So, Silkie, you're outside of the US, but you can see the discussion that's happening in the US. What do you make of this, of these systems
• 16:15 and of the inherent biases that some would say are built in? It's a big problem, and it needs to be really carefully examined. We have similar concerns in the UK, and we have pressured the police to do some
• 16:30 independent testing of the algorithms they're using here. Here they're using a Japanese company called NEC, and we've asked if they understand what biases might be inherent in the technology, and
• 16:45 they have said, basically, that they're not interested. But there are also issues with what kind of watch lists are being put together. We first saw this technology being used at Notting Hill Carnival, which is a black
• 17:00 British celebration in London, and it was that community who were used as guinea pigs for this surveillance, two years in a row, which is just incredible. So it's also a matter of not only how biased the technology is
• 17:15 but who are the people the police are targeting with it. But I also think that some of the technology issues are temporal, and I share the other guests' fears that actually the better this becomes, the more perfect a tool for oppression it
becomes as well.
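The independent testing Silkie calls for usually boils down to measuring error rates separately per demographic group, as in the AOC exchange above. A minimal sketch with fabricated trial data; the group labels and numbers are illustrative only, not results from any real benchmark:

```python
def false_match_rate_by_group(trials):
    """trials: list of (group, match_reported, actually_same_person).

    The false match rate is how often the system reports a match when
    the two faces are in fact different people ("impostor" pairs).
    A system is biased if this rate differs sharply between groups.
    """
    rates = {}
    for group in {t[0] for t in trials}:
        impostor = [t for t in trials if t[0] == group and not t[2]]
        false_matches = [t for t in impostor if t[1]]
        rates[group] = len(false_matches) / len(impostor)
    return rates

# Fabricated results: 100 impostor pairs per group, different error counts.
trials = (
    [("group_a", True, False)] * 2 + [("group_a", False, False)] * 98
    + [("group_b", True, False)] * 10 + [("group_b", False, False)] * 90
)
print(false_match_rate_by_group(trials))
```

In this made-up data, group_b is falsely matched five times as often as group_a; that is the shape of disparity that independent audits look for.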
• 17:30 Yeah, most definitely. And it's worth mentioning that I have a gauge of kind of how much our audience is responding to each show that we do here at The Stream, and this one has tons of Twitter threads and tons of comments on YouTube. So I think it's generally something that a lot of people
• 17:45 are concerned about, wondering how many of the questions are left unanswered. Very quickly, I want to share this tweet with you before we move on to the next portion of the show. An Organic African Feminist says: what worries me the most is that we're asking "what can we do?", a question about developing new
• 18:00 technologies, rather than also asking "why are we doing this?" and "what is motivating us to do this?". Not asking the latter enables the facade of value neutrality. And that goes on into a very lengthy thread that you can check out on Twitter, of course. For now, though, let's
• 18:15 dive a little deeper into this conversation. Let's look at how the Chinese government is using a sophisticated facial recognition network to track its own citizens, with a focus on the minority Uighur Muslim community. Take a listen to this comment sent to us
• 18:30 by Cindy Yu, a past guest on this show. China's use of facial recognition technology fits into a wider pattern of its attempt to roll out a nationwide social credit system, whereby every citizen is rated on their trustworthiness based on information
• 18:45 from big data and, you guessed it, facial recognition. In Xinjiang we've already seen this, where jaywalkers are being recognized by cameras as they cross the road, and their identities displayed on big billboards across the road. In China's proud culture, this sort of
• 19:00 public shaming can be very effective. Matt, when you hear that from her, I mean, in China's context, is there a particular fear, or are they taking this a step further? How can you
• 19:15 contextualize that for our audience? China should really serve as a lesson, an instructive lesson, in what the United States and other nations, and frankly what the Chinese government, should avoid. One of the stories on the Chinese use of
• 19:30 this focused on a mosque that previously, years before face surveillance, had been bustling at prayer hours; now that mosque is desolate and deserted. This just illustrates very clearly the chilling effect that happens when people know that going outside means having
• 19:45 your face scanned, your name logged into a government database, and maybe your identity placed on a watch list for government agents. But we would be fooling ourselves if we didn't think that the United States government had a history of turning surveillance technologies against these kinds of
• 20:00 communities. We have seen it, with everything from license plate readers, which scan vehicles, to social media surveillance, that the people who are disproportionately targeted by American governments, local and federal
• 20:15 governments, are people of color; they are immigrants. We've seen Black Lives Matter tracked, and we're seeing ICE, the Immigration and Customs Enforcement agency, deploy these same kinds of tools right now. So it's really important to act and defend
• 20:30 ourselves right now, and that's exactly what San Francisco, and now a domino effect of communities, are going to do with this particularly dangerous technology. I also see the same in the UK. I mean, I just want to point out that even in the trial phase in the UK,
• 20:45 police have been using facial recognition against peace activists, not the most dangerous people, and against people with mental health problems as well. So we haven't even seen, you know, this
• 21:00 evolve towards more authoritarianism over time; the start point and the end point with this technology are really disturbing in nature, I think. Lily? Yeah, I also think that China is a
know about their system is true it's not that I par9cularly doubt that they could
develop you know this sort of mass scale ubiquitous facial recogni9on I
• 21:30think it's probably all true but the source on it is the Chinese government and
the source on you know their ability to spot someone in a crowd of 50,000 people at
a concert or something all those sort of triumphs that they discuss and you know this
type of footage that
• 21:45you're seeing on the screen this is really all coming from them with not a lot of
you know third party opera9ng or independent audi9ng going on so one of the
dangers you have is if the system isn't as robust or isn't as accurate as
• 22:00they say maybe someone who wasn't even jaywalking gets put up on the
billboard and gets publicly shamed and that type of thing would be really difficult to
bring to light so just again it shows the poten9al dangers either way whether a
system is working
• 22:15as intended or not you know there's s9ll big ramifica9ons I think what's
interesting is the idea of opting in and opting out, and when you don't have that opportunity to opt in. So I want you to take a look at this tweet circulating online; our producers found this before
• 22:30 the show. This is Matthew Brennan, who tweets: Wow. China airport face recognition systems to help you check your flight status and find the way to your gate. Note I did not input anything; it accurately identified
• 22:45 my full flight information from my face. Now, that is pretty creepy to me. It's eerie, though I could see the argument that it makes things efficient. But I want you to have a listen to a clip from 2017 that shows another side of this. In it, Lucien, who worked as the head of an aid
• 23:00 station in Shanghai, explains how facial recognition software can help regular passers-by on the street. We think running this facial recognition system has reduced the time needed to do searches, lightened the workload of our
• 23:15 staff and made our searches more efficient. That lets us help people faster when they are unsure about their identity, and helps us as much as we can to find their relatives. So when it comes to talking about dementia patients, or
• 23:30 other people who are outside and lost, facial recognition, some would say, well, it can help in that. Silkie, what's your take on the beneficial uses, and the opting in, but not having a
• 23:45 choice to opt in? I mean, it might be that there are some beneficial uses; I just haven't seen them yet. My particular issue is the inherent risks with live facial recognition, in
• 24:00 which there is no consent, and which is inherently a mass surveillance and mass identification tool. When you have a one-to-one face comparison, well, of course there's a whole host of things that can be useful for, but we mustn't conflate that with live facial
• 24:15 recognition, whereby thousands of people can be identified at any one time.
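Silkie's distinction here maps onto two different operations in facial recognition systems: one-to-one verification (like unlocking your own phone) versus one-to-many identification (a live camera scanning everyone). A sketch, again assuming precomputed embedding vectors and an illustrative threshold of my own choosing:

```python
import numpy as np

THRESHOLD = 0.6  # illustrative; real deployments tune this value

def verify(face_a, face_b):
    """1:1 -- does this face match one claimed identity?
    E.g. unlocking your own phone: you opt in, one comparison."""
    return float(np.linalg.norm(face_a - face_b)) < THRESHOLD

def identify(probe, watchlist):
    """1:N -- who, out of everyone enrolled, might this face be?
    A live camera runs this against every passer-by, with no opt-in."""
    distances = np.linalg.norm(watchlist - probe, axis=1)
    best = int(np.argmin(distances))
    return best if distances[best] < THRESHOLD else None
```

The difference in code is tiny; the civil-liberties difference, as the guests argue, is that `verify` runs once with consent, while `identify` runs continuously against everyone in view of the camera.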
Very quickly, I just wanted to throw this comment in there. I'm always dumbfounded, personally, when people say this, but Rashad TW says: due to the large population in China, I think it's necessary, since it
• 24:30 easily reduces crime, and perhaps the majority of the Chinese population accepts it; watching over me is fine unless I'm a criminal, which will make me hate the system. Well, we will be watching. That's all the time we have for today. Thanks to our guests Matt, Silkie and
• 24:45 Lily. We can keep this conversation going online by following us on Twitter; we are @AJStream. We'll see you next time.
