« "Unwarranted Disparity: Effectively Using Statistics in Federal Sentencing" | Main | Ohio Gov Kasich officially pushed back nine executions as lethal injection litigation comes before en banc Sixth Circuit »

May 1, 2017

Spotlighting again the use of risk-assessment computations at sentencing (under an inaccurate headline)

Adam Liptak has this new column discussing the Loomis risk-assessment sentencing case pending SCOTUS cert review, but the column bears the inaccurate headline "Sent to Prison by a Software Program’s Secret Algorithms."  As of this writing, no software program alone has sent anyone to prison, not in the Wisconsin case before SCOTUS nor in any other case that I know of.  Software may be making recommendations to sentencing decision-makers, and that certainly justifies scrutiny, but we have not quite reached the brave new world that this headline suggests.  That said, the headline did grab my attention, and here are parts of the article that follows (a short illustrative sketch from me appears after the excerpt):

[A] Wisconsin man, Eric L. Loomis, [was] sentenced to six years in prison based in part on a private company’s proprietary software. Mr. Loomis says his right to due process was violated by a judge’s consideration of a report generated by the software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.

In March, in a signal that the justices were intrigued by Mr. Loomis’s case, they asked the federal government to file a friend-of-the-court brief offering its views on whether the court should hear his appeal.

The report in Mr. Loomis’s case was produced by a product called Compas, sold by Northpointe Inc. It included a series of bar charts that assessed the risk that Mr. Loomis would commit more crimes. The Compas report, a prosecutor told the trial judge, showed “a high risk of violence, high risk of recidivism, high pretrial risk.” The judge agreed, telling Mr. Loomis that “you’re identified, through the Compas assessment, as an individual who is a high risk to the community.”

The Wisconsin Supreme Court ruled against Mr. Loomis. The report added valuable information, it said, and Mr. Loomis would have gotten the same sentence based solely on the usual factors, including his crime — fleeing the police in a car — and his criminal history.

At the same time, the court seemed uneasy with using a secret algorithm to send a man to prison. Justice Ann Walsh Bradley, writing for the court, discussed, for instance, a report from ProPublica about Compas that concluded that black defendants in Broward County, Fla., “were far more likely than white defendants to be incorrectly judged to be at a higher risk of recidivism.”

Justice Bradley noted that Northpointe had disputed the analysis. Still, she wrote, “this study and others raise concerns regarding how a Compas assessment’s risk factors correlate with race.” In the end, though, Justice Bradley allowed sentencing judges to use Compas. They must take account of the algorithm’s limitations and the secrecy surrounding it, she wrote, but said the software could be helpful “in providing the sentencing court with as much information as possible in order to arrive at an individualized sentence.”

Justice Bradley made Compas’s role in sentencing sound like the consideration of race in a selective university’s holistic admissions program. It could be one factor among many, she wrote, but not the determinative one.

In urging the United States Supreme Court not to hear the case, Wisconsin’s attorney general, Brad D. Schimel, seemed to acknowledge that the questions in the case were substantial ones. But he said the justices should not move too fast. “The use of risk assessments by sentencing courts is a novel issue, which needs time for further percolation,” Mr. Schimel wrote.

He added that Mr. Loomis “was free to question the assessment and explain its possible flaws.” But it is a little hard to see how he could do that without access to the algorithm itself. The company that markets Compas says its formula is a trade secret. “The key to our product is the algorithms, and they’re proprietary,” one of its executives said last year. “We’ve created them, and we don’t release them because it’s certainly a core piece of our business.”

Compas and other products with similar algorithms play a role in many states’ criminal justice systems. “These proprietary techniques are used to set bail, determine sentences, and even contribute to determinations about guilt or innocence,” a report from the Electronic Privacy Information Center found [available here]. “Yet the inner workings of these tools are largely hidden from public view.”...

There are good reasons to use data to ensure uniformity in sentencing. It is less clear that uniformity must come at the price of secrecy, particularly when the justification for secrecy is the protection of a private company’s profits. The government can surely develop its own algorithms and allow defense lawyers to evaluate them. At Rensselaer last month, Chief Justice Roberts said that judges had work to do in an era of rapid change. “The impact of technology has been across the board,” he said, “and we haven’t yet really absorbed how it’s going to change the way we do business.” 
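
A final thought from me: because the Compas formula is secret, it is easy to talk past one another about what these tools actually do.  The short Python sketch below is purely illustrative (the weights, cutoffs, and simulated data are all invented, and nothing in it comes from Northpointe), but it shows the generic shape of an actuarial risk instrument, a weighted score bucketed into low/medium/high bands, along with the kind of error-rate audit ProPublica ran: among defendants who did not in fact reoffend, what share had been labeled high risk?  Notice that the toy score never takes group membership as an input, and yet its false positive rates can still diverge when inputs like arrest counts are distributed differently across the groups.

    import random

    random.seed(42)  # reproducible toy data

    # Hypothetical weights for a toy linear risk score.  A real instrument
    # would fit these to historical data; these numbers are invented and
    # are emphatically not Compas's (secret) formula.
    WEIGHTS = {"prior_arrests": 0.3, "age_under_25": 0.8, "prior_fta": 0.5}
    INTERCEPT = -1.0

    def raw_score(d):
        """Weighted sum of risk factors: the generic shape of an actuarial
        tool.  Group membership is never an input."""
        return INTERCEPT + sum(w * d[k] for k, w in WEIGHTS.items())

    def risk_band(score):
        """Bucket the raw score into the low/medium/high label a judge
        sees.  The cutoffs here are arbitrary."""
        if score < 0.0:
            return "low"
        if score < 1.0:
            return "medium"
        return "high"

    def simulate(n, group, max_priors):
        """Toy cohort with a recorded outcome (1 = reoffended).  Giving the
        groups different arrest-count ranges stands in for, e.g.,
        differential policing intensity."""
        people = []
        for _ in range(n):
            d = {"group": group,
                 "prior_arrests": random.randint(0, max_priors),
                 "age_under_25": random.randint(0, 1),
                 "prior_fta": random.randint(0, 1)}
            d["band"] = risk_band(raw_score(d))
            # Outcome loosely tied to the score, plus noise; the implied
            # probabilities stay between 0.1 and just under 0.5.
            d["reoffended"] = int(random.random() < 0.2 + 0.1 * raw_score(d))
            people.append(d)
        return people

    def false_positive_rate(people):
        """Among those who did NOT reoffend, the share labeled high risk:
        the 'incorrectly judged to be at a higher risk' error."""
        clean = [p for p in people if not p["reoffended"]]
        return sum(p["band"] == "high" for p in clean) / len(clean)

    cohort = simulate(5000, "A", max_priors=4) + simulate(5000, "B", max_priors=8)
    for g in ("A", "B"):
        fpr = false_positive_rate([p for p in cohort if p["group"] == g])
        print("group %s: false positive rate among non-reoffenders = %.1f%%"
              % (g, 100 * fpr))

This, of course, is exactly the kind of audit that defense lawyers (or anyone else) could run if sentencing algorithms and their validation data were public, as the column's closing paragraphs suggest they could be.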

Some prior related posts on the Loomis case:

May 1, 2017 at 12:38 PM

Comments

Happy Law Day!

Posted by: Joe | May 1, 2017 12:57:00 PM

The legislature should write an open algorithm. It should be subject to public comment for the standard period of time. It should replace all sentencing by judges, who are totally biased toward coddling the customer, the criminal. They don't know anything about anything.

Then follow-up studies should result in revisions of the algorithm every 5 years, based on outcomes.

One element that is needed is biological markers, such as physiological testing and, in the future, genetic testing for factors that raise the risk of violence. The most powerful marker known today is the consumption of alcohol. Here is a review of current blood markers of long-term consumption:

https://pubs.niaaa.nih.gov/publications/ahrw18-2/131-135.pdf

Machines are 100 times better than living beings. Anyone who disagrees should be forced to commute to work, on a snow day, riding a horse. As for the guy who said the use of AI in sentencing "must stop now": take his car. Make him ride a jackass.

Errors in data entry, or the entry of false information, should subject the sentencing authority to civil tort liability with exemplary damages. Such an error turns the sentence into a false imprisonment. If the legislature writes an algorithm that is mistaken according to currently known information, it should be subjected to an aggregate claim, to deter.

Posted by: David Behar | May 1, 2017 1:30:39 PM

Up and down the City Road
In and out the Eagle,
That’s the way the money goes,
Pop! Goes the weasel

The Eagle was a London tavern where the poor spent money they could not afford in order to drown their sorrows.

Joe, when is Pop Goes the Weasel Day?

Posted by: David Behar | May 2, 2017 4:01:59 AM
