Giving Compass' Take:
- The Marshall Project highlights a study contributing to the debate over racial bias in risk assessment tools widely used in courtrooms.
- How can donors fight the use of racist algorithms? What role can you play in addressing racial inequality in the justice system?
In this data-driven era, if you’ve been arrested, it’s increasingly likely the judge deciding whether to send you home or to jail to await trial will consult actuarial math. Specialized algorithms—called risk assessment tools—plumb your history, demographics, and other details to spit out a score quantifying how likely you are to commit another crime or to show up at your next hearing. But these tools have come under fire for treating people of color more harshly.
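To make the mechanics concrete, here is a minimal sketch of how such a tool typically produces a score: a handful of weighted factors feed a logistic model, and the resulting probability is collapsed into the low/medium/high label a judge sees. The factor names, weights, and thresholds below are hypothetical, not drawn from any actual instrument.

```python
import math

# Hypothetical factors and weights, illustrative only; no real
# risk assessment tool's parameters are reproduced here.
WEIGHTS = {
    "prior_arrests": 0.30,
    "prior_failures_to_appear": 0.45,
    "age_under_25": 0.40,
}
INTERCEPT = -2.0

def risk_score(defendant: dict) -> float:
    """Logistic-regression-style estimate of the probability of
    rearrest or failure to appear before trial."""
    z = INTERCEPT + sum(w * defendant.get(k, 0) for k, w in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))

def risk_bucket(p: float) -> str:
    """Collapse the probability into the label a judge actually sees."""
    return "low" if p < 0.3 else "medium" if p < 0.6 else "high"

defendant = {"prior_arrests": 3, "prior_failures_to_appear": 1, "age_under_25": 1}
p = risk_score(defendant)
print(f"score={p:.2f} -> {risk_bucket(p)}")   # score=0.44 -> medium
```

Everything the model "knows" arrives through those inputs, which is exactly where the critique below takes hold.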
According to a paper from the Center for Court Innovation, a justice reform agency based in New York City, those criticisms may be well-founded. “There’s no way to square the circle there, taking the bias out of the system by using data generated by a system shot through with racial bias,” said Matt Watkins, senior writer at the Center and one of the authors of the paper.
Risk assessments are pitched as “race-neutral,” replacing human judgment, which is subjective and shaped by implicit bias, with objective, scientific criteria. The trouble is that the most accurate tools draw on existing criminal justice data: what actually happened to large numbers of people arrested in any particular location. And the experience of those people in the criminal justice system is itself fraught with racial disparities and implicit bias.
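A toy example shows how that feedback works: two people with identical underlying behavior, but different odds that any given offense ends in a recorded arrest (a stand-in for unequal policing intensity), come out of the same "race-neutral" formula with different scores. All numbers here are hypothetical.

```python
import math

def risk_score(prior_arrests: int) -> float:
    """Same logistic form as the sketch above; the weights are hypothetical."""
    z = -2.0 + 0.45 * prior_arrests
    return 1 / (1 + math.exp(-z))

# Identical underlying conduct, different enforcement: the model never
# sees the conduct, only the arrests it left behind in the data.
offenses = 5
for group, arrest_rate in [("lightly policed", 0.3), ("heavily policed", 0.8)]:
    recorded_arrests = round(offenses * arrest_rate)  # 2 vs. 4 arrests
    p = risk_score(recorded_arrests)
    print(f"{group}: {recorded_arrests} arrests on record -> score {p:.2f}")
# lightly policed: 2 arrests on record -> score 0.25
# heavily policed: 4 arrests on record -> score 0.45
```

The disparity in the output is produced entirely by the disparity in the input data, not by any race variable in the formula, which is the point the paper's authors are making.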
Read the full article about racist algorithms by Beth Schwartzapfel at The Marshall Project.