July 25, 2018
"Procedural Justice and Risk-Assessment Algorithms"
The title of this post is the title of this article recently posted to SSRN and authored by A.J. Wang. Here is the abstract:
Statistical algorithms are increasingly used in the criminal justice system. Much of the recent scholarship on the use of these algorithms have focused on their "fairness," typically defined as accuracy across groups like race or gender. This project draws on the procedural justice literature to raise a separate concern: does the use of algorithms damage the perceived fairness and legitimacy of the criminal justice system?
Through three original survey experiments on a nationally representative sample, it shows that the public strongly disfavors algorithms as a matter of fairness, policy, and legitimacy. While respondents generally believe algorithms to be less accurate than either psychologists or statutory guidelines, accuracy alone does not explain their preferences. Creating "transparent" algorithms helps but is not enough to make algorithms desirable in their own right. In a finding both surprising and troubling, members of the public seem more willing to tolerate disparate outcomes when they stem from an algorithm than from a psychologist.
Machines are 100 times better than living beings. Ride a horse to work on a snow day. Then do the same in a Mercedes. That thing is German, and it will criticize your driving. It will beep, show a message to take a rest, and display an icon of a cup of coffee if your driving becomes too erratic for its algorithm.
Sentencing algorithms should be owned, written, and openly published by the legislature. The legislature should take public comments and case histories, then update the algorithm every two years. The revised algorithms should apply retroactively to prior sentences.
Posted by: David Behar | Jul 25, 2018 12:37:45 PM