Monday, August 21, 2017

Miscalculated

I was listening to the news on my way home from work when a story came on about a computer algorithm designed to reduce the reliance on cash bail in deciding who stays in jail while awaiting trial and who goes free.

This past July, one Edward French was murdered in San Francisco. One of the suspects is a young man who was on felony probation for breaking into cars, gun possession, and parole violations. The judge relied on the suspect's "public safety assessment" in her decision to release him into an "assertive case management" pretrial program. The reporter's assessment is harsh: "In the case of Ed French, the algorithm failed."

But implicit in that assessment is an idea that is never actually asserted within the story itself: that the role of the public safety assessment is to determine, as a yes-or-no question, whether a suspect will return to court for trial without attempting to flee or committing another crime.

The [Arnold] foundation[, which created the tool,] says the algorithm generates gender- and race-neutral "evidence-based data" on which defendants should be released before trial, offering judges "reliable, predictive information about the risk that a defendant released before trial will engage in violence, commit a new crime, or fail to return to court."

From the NPR story, headlined "Did A Bail Reform Algorithm Contribute To This San Francisco Man's Murder?": "In the case of French, a miscalculation ended in murder."
And here again is that implication: that the job of the algorithm is to rate people as "safe" or "unsafe," and that the young suspect in Mr. French's murder was mistakenly, and disastrously, rated "safe." But that's not the way a risk assessment algorithm works, especially not one that outputs a score. The Arnold Foundation's tool takes nine factors and uses them to express, as a score, the probability that someone will flee the jurisdiction or commit another crime. The only way this could be a determining score, effectively offering a "safe" or "unsafe" answer, is if those factors created a hard line, on one side of which the flight and recidivism rate was effectively zero and on the other side effectively 100%. But the world isn't that cut and dried. The lowest-risk category could still carry a 0.5% chance that a suspect flees or commits a crime while out of jail.
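To make that distinction concrete, here is a minimal sketch of a points-based risk score. The factor names, weights, and failure rates below are invented for illustration; this is not the Arnold Foundation's actual tool, which uses nine factors and its own published weights. The point is the shape of the thing: each factor adds points, the total falls into a score bucket, and each bucket corresponds to an observed rate of flight or re-arrest, never a "safe"/"unsafe" verdict.

# Hypothetical, minimal sketch of a points-based pretrial risk score.
# Factor names, weights, and rates are invented for illustration; this
# is NOT the Arnold Foundation's actual public safety assessment.

FACTOR_WEIGHTS = {
    "pending_charge_at_arrest": 1,
    "prior_felony_conviction": 1,
    "prior_violent_conviction": 2,
    "prior_failure_to_appear": 2,
    "current_charge_is_violent": 2,
}

# Each score bucket maps to an observed *rate* of flight or re-arrest,
# never to a categorical "safe"/"unsafe" verdict. Even the lowest bucket
# carries a nonzero rate (roughly one in two hundred).
BUCKET_FAILURE_RATES = {1: 0.005, 2: 0.02, 3: 0.06, 4: 0.15, 5: 0.30, 6: 0.50}


def risk_bucket(defendant):
    """Sum the weights of the factors present and compress into buckets 1-6."""
    raw = sum(weight for factor, weight in FACTOR_WEIGHTS.items()
              if defendant.get(factor))
    return min(6, max(1, raw + 1))


def estimated_failure_rate(defendant):
    """Return a probability estimate, not a yes-or-no answer."""
    return BUCKET_FAILURE_RATES[risk_bucket(defendant)]


if __name__ == "__main__":
    low_risk = {}  # no risk factors present
    high_risk = {"prior_felony_conviction": True,
                 "prior_violent_conviction": True,
                 "prior_failure_to_appear": True}
    print(estimated_failure_rate(low_risk))   # 0.005 -- unlikely, not "safe"
    print(estimated_failure_rate(high_risk))  # 0.5   -- likelier, not "unsafe"

Read this way, a "low risk" score is a statement that roughly one defendant in two hundred in that bucket will flee or reoffend, not a guarantee that this particular defendant won't.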

Which, of course, is cold comfort to the partners, friends, and relatives of the slain, who see, with perfect hindsight, every little reason why the one person in two hundred, the person who killed their loved one, should have remained in jail.

And that's the problem with the bail bond system: the expectation that it's a tool to create 100% safety, and that anything less is a failure. It's an expectation that news outlets feed into. Even National Public Radio, which ran a series on why the cash bail system is simply a trap for the poor and a low-risk cash cow for the bail bonds industry, can't see its way clear to an angle other than the human interest of a life snuffed out, the suffering and anger of those left behind, and the fear that an algorithm intended to look past considerations such as race and class might convince a judge to release someone who then goes on to kill a person who seems undeserving of that fate, all because of a "miscalculation."
