Turns + Some Blocks
Well-managed predictive policing software is accurate and less racially biased; it reduces mass
incarceration
Melamed 2017 writes
https://www.inquirer.com/philly/news/crime/atlantic-city-risk-terrain-modeling-rutgers-predictive-policing-joel-caplan-20170810.html
Samantha Melamed, August 10, 2017
Atlantic City is not where you might expect to find the next policing success story. It’s in the throes of a
financial crisis that triggered a state takeover. The police force has been pared down to 267 officers from
374 in 2010. Even more severe cuts could be on the way. And yet, lately, Police Chief Henry White Jr.
finds himself optimistic. “The first six months of this year, our violent crime is down about 20 percent
compared to the same time last year. But at the same time our arrests are also down 17 percent,” he
said. “We were able to reduce crime without contributing to mass incarceration.” That’s a first for
White, an old-school cop who started on foot patrol in 1985 and rose through the ranks. What gives him
hope is a high-tech intervention his department has adopted with help from Rutgers University
criminologists. It’s a type of predictive policing called risk-terrain modeling that aims to help police
identify places that attract crime, and intervene to make them less attractive to criminals.
We can correct for false flags by taking race into account when applying the data models.
Siegel 18
Eric Siegel (PhD, author of Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie or Die).
“How to Fight Bias with Predictive Policing.” Scientific American. February 19, 2018.
https://blogs.scientificamerican.com/voices/how-to-fight-bias-with-predictive-policing/
Third, the black population is ravaged by false flags. As a result of being flagged more often, undeserving
black defendants and suspects are wrongly flagged almost twice as often as undeserving whites. Unlike
the first two points above, this does not necessarily mean the flags themselves are unfairly influenced by
race. Taking this systematic issue into consideration, however, contributes to the greater good. It is an
opportunity to help compensate for past and present racial injustices and the cycles of
disenfranchisement that ensue. This is where predictive policing can de-escalate such cyclic patterns
rather than inadvertently magnify them. Just as we protect suspects by limiting the power to convict
when evidence has been illegally obtained, we can choose to take protective measures on behalf of this
disenfranchised group as well. This is a unique opportunity for law enforcement to be a part of the
solution rather than a part of the problem.
If we make it so, predictive policing could turn out to be a sheep dressed in wolf’s clothing. Unearthing
inequity, it looks threatening—but it presents an unprecedented opportunity to implement new
measures to fight social injustice. Crime-predicting models themselves must remain colorblind by
design, but the manner in which we contextualize and apply them cannot remain so. Reintroducing
race in this way is the only means to progress from merely screening predictive models for racial bias to
intentionally designing predictive policing to actively advance racial justice.
The problem is racial inequality, not predictive policing. Predictive policing can be enhanced
to actively advance racial justice.
Siegel 18
Eric Siegel (PhD, author of Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie
or Die). “How to Fight Bias with Predictive Policing.” Scientific American. February 19, 2018.
https://blogs.scientificamerican.com/voices/how-to-fight-bias-with-predictive-policing/
But instead of crossing swords, the ultimate resolution would be to agree on measures to
combat racial inequity. Debating whether the COMPAS model deserves the indictment “biased”
distracts from the next course of action. Rather than only vetting a predictive model for whether
it worsens racial injustice, let’s enhance predictive policing to actively help improve things. The
key impetus to do so comes directly from the seeming paradox behind this dispute over “bias”
that makes it so sticky to resolve. The oddity itself brings to light a normally hidden symptom of
today’s racial inequity: If predictive flags are designed so they indicate the same re-offense
probability for both white and black defendants—that is, designed to be equally precise for both
groups—then, given the higher overall rate of re-offense among black defendants, that group
suffers a greater prevalence of false flags.
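Siegel’s paradox is pure arithmetic: hold a model’s precision and sensitivity equal across two groups, and the group with the higher base rate of re-offense mechanically ends up with a higher false-flag rate among its innocent members. The sketch below illustrates this with hypothetical numbers (not COMPAS data or figures from the article); the 80% sensitivity/precision values and the 0.5 vs. 0.3 base rates are assumptions chosen only to make the effect visible.

```python
def false_positive_rate(n, base_rate, sensitivity, precision):
    """False-flag rate among people who do NOT re-offend, for a model
    applied with fixed sensitivity and precision to a group of size n
    whose true re-offense base rate is base_rate."""
    reoffenders = n * base_rate
    true_pos = reoffenders * sensitivity          # re-offenders correctly flagged
    flagged = true_pos / precision                # total flags implied by precision
    false_pos = flagged - true_pos                # non-re-offenders wrongly flagged
    non_reoffenders = n - reoffenders
    return false_pos / non_reoffenders

# Same model quality for both groups (80% sensitivity, 80% precision),
# different underlying base rates — hypothetical values for illustration.
fpr_high_base = false_positive_rate(1000, base_rate=0.5, sensitivity=0.8, precision=0.8)
fpr_low_base = false_positive_rate(1000, base_rate=0.3, sensitivity=0.8, precision=0.8)
print(round(fpr_high_base, 3))  # 0.2
print(round(fpr_low_base, 3))   # 0.086
```

With these assumed numbers the higher-base-rate group’s innocent members are wrongly flagged more than twice as often (20% vs. about 8.6%), even though the model treats both groups identically — which is exactly the disparity the card describes.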
Discrimination
Find a study -
- Without predictive policing, humans pick where crimes will happen, and they bring
more bias
- Data can be checked for unequal treatment, rigorously inspected, and regulated; we
can’t do the same for humans
- Cost reductions lead to more money being used to help poor and minority
neighborhoods
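The second bullet’s claim — that data-driven decisions can be audited in a way human hunches cannot — can be made concrete: if every flagging decision is logged, computing flag rates per group is a few lines of code. The sketch below uses a hypothetical log schema of `(group, flagged)` pairs invented for illustration; a real audit would run over actual deployment logs.

```python
from collections import Counter

def audit_flag_rates(records):
    """Compute the flag rate per group from logged decisions.

    records: iterable of (group, flagged) pairs, where flagged is a bool.
    Returns {group: fraction of that group's records that were flagged}.
    """
    flagged = Counter()
    totals = Counter()
    for group, was_flagged in records:
        totals[group] += 1
        if was_flagged:
            flagged[group] += 1
    return {g: flagged[g] / totals[g] for g in totals}

# Tiny hypothetical decision log: group A flagged 1 of 4 times, group B 2 of 4.
log = [("A", True), ("A", False), ("A", False), ("A", False),
       ("B", True), ("B", True), ("B", False), ("B", False)]
print(audit_flag_rates(log))  # {'A': 0.25, 'B': 0.5}
```

A disparity like the 0.25 vs. 0.5 above is exactly the kind of unequal treatment a regulator could detect and require the department to explain — the point being that no equivalent automated check exists for an individual officer’s intuitions.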
LA Times 19
LA Times Editorial Board. “Editorial: The problem with LAPD’s predictive policing.”
The Los Angeles Police Department embraced predictive policing in 2015, but it has
taken until now for the department’s assortment of once-shadowy data-based operations
to be thoroughly vetted in public. In the end, that’s the essential point: it took
work by activists to bring programs like LASER (a data-crunching operation that identifies
crime hot spots) and PredPol (a software program that predicts property crimes)
into the light of day, and they are to be commended; but they are off base in their demands
that police scrap the tools entirely. Data, used properly, can enhance public
safety. Police should be encouraged to use it, as long as they are open about what they
are doing, and as long as they heed legitimate criticism and adjust their programs
accordingly. Failure to carefully tailor predictive policing programs invites invasion of
privacy, racial profiling and other unacceptable side effects.