
• The film joins the activist organisation Big Brother Watch on the camera-encrusted streets of London, where they are launching high-level legal challenges against mass monitoring and pounding the sidewalk, assisting passers-by who have been stopped by police after a hidden camera pointed them out. When a facial analysis system misidentified a schoolboy, a gang of un-uniformed officers hauled him off a crowded street. The student was of African-American descent.

• A bizarre altercation occurs in another scenario, in which cops punish a man for covering his face while passing a facial recognition camera. Even if the technology were accurate in detecting suspects (which it isn't), law enforcement officials are plainly using it to label certain behaviors as suspicious, harass those who exhibit them, and then impose punishment.

• If facial analysis can't reliably distinguish Black faces, and machine learning can't detect the human biases in the data that feeds it, then the technology clearly has no place wielding such massive power over individuals.

• Facial analysis serves as a springboard for deftly broadening the scope of the subject. Coded Bias highlights algorithms, data science, machine learning, and so-called artificial intelligence, all of which do the bidding of programmers, engineers, and billionaires who pass on their conscious and unconscious biases. The result is a brave new world of machines that resembles the racist and sexist power systems of the past.

• Today's innovation depends on judgments made not just by those currently working in the field, but also by those who laid the groundwork and brought us to where we are now, steered deliberately or unknowingly by their own beliefs about what science should be and whom it should serve.

• There is a noticeable continuity between who was in power then and who is in charge now, from yesterday's (mainly white, largely male) thinkers to today's (mostly white, mostly male) tech billionaires. As Buolamwini puts it, "data is a reflection of our history. Our algorithms house the ghosts of the past."

• The major takeaway from Coded Bias is that it's time to confront this. Without reading the terms and conditions, we opted into an unimaginably large monitoring project managed by a handful of profit-driven firms, billionaires, and states, one in which democratically elected lawmakers and the people themselves are steps behind. However, Buolamwini and a global coalition of campaigners have already driven genuine change as governments grapple with the repercussions.

• Coded Bias is an important warning, but it's not meant to make one feel hopeless as one returns to doom-scrolling on one's smartphone. Buolamwini personifies optimism as the founder of the Algorithmic Justice League, a campaign group dedicated to combating unfairness in decision-making algorithms. Throughout the film, she and the filmmakers seek out and join forces with other activists, many of whom are Black and women, who are chipping away at hidden monoliths of control in places like apartment blocks, hair salons, and MIT labs, as well as in the halls of power.
