The Marshall Project

Can Racist Algorithms Be Fixed?

A new study adds to the debate over racial bias in risk assessment tools widely used in courtrooms.

In this data-driven era, if you’ve been arrested it’s increasingly likely the judge deciding whether to send you home or to jail to await trial will consult actuarial math. Specialized algorithms—called risk assessment tools—plumb your history, demographics, and other details to spit out a score quantifying how likely you are to commit another crime or to show up at your next hearing. But these tools have come under fire for treating people of color more harshly.

According to a study out Monday from the Center for Court Innovation, a justice reform agency based in New York City, those criticisms may be well-founded. “There’s no way to square the circle there, taking the bias

