December 8, 2018

"Bias In, Bias Out"

The title of this post is the title of this recent article authored by Sandra Mayson that I just came across on SSRN. Here is its abstract:

Police, prosecutors, judges, and other criminal justice actors increasingly use algorithmic risk assessment to estimate the likelihood that a person will commit future crime.  As many scholars have noted, these algorithms tend to have disparate racial impact. In response, critics advocate three strategies of resistance: (1) the exclusion of input factors that correlate closely with race, (2) adjustments to algorithmic design to equalize predictions across racial lines, and (3) rejection of algorithmic methods altogether.

This Article’s central claim is that these strategies are at best superficial and at worst counterproductive, because the source of racial inequality in risk assessment lies neither in the input data, nor in a particular algorithm, nor in algorithmic methodology.  The deep problem is the nature of prediction itself.  All prediction looks to the past to make guesses about future events.  In a racially stratified world, any method of prediction will project the inequalities of the past into the future.  This is as true of the subjective prediction that has long pervaded criminal justice as of the algorithmic tools now replacing it.  What algorithmic risk assessment has done is reveal the inequality inherent in all prediction, forcing us to confront a much larger problem than the challenges of a new technology.  Algorithms shed new light on an old problem.

Ultimately, the Article contends, redressing racial disparity in prediction will require more fundamental changes in the way the criminal justice system conceives of and responds to risk.  The Article argues that criminal law and policy should, first, more clearly delineate the risks that matter, and, second, acknowledge that some kinds of risk may be beyond our ability to measure without racial distortion — in which case they cannot justify state coercion.  To the extent that we can reliably assess risk, on the other hand, criminal system actors should strive to respond to risk with support rather than restraint whenever possible.  Counterintuitively, algorithmic risk assessment could be a valuable tool in a system that targets the risky for support.

December 8, 2018 at 03:54 PM

Comments

In my experience, prosecutors, judges, and probation officers are only using these tools because they were pushed upon them in the political environment of the so-called smart-on-crime initiatives. The advocates of these tools have been the academic community and the various other groups that believe that law enforcement and the criminal justice system are inherently biased against minorities. Now we are being told that the tools themselves are biased.

I, for one, would be happy to give them back. As a career prosecutor in a Bay Area California county, my experience with these tools is that they underestimate the risk of nonappearance or criminality while out on bail, particularly in gang cases. We have been using such a tool for seven years, and the only people who put any faith in it are the pretrial services officers who are mandated to do so.

Posted by: David | Dec 9, 2018 11:41:10 AM
