Tuesday, October 08, 2024
CCJ releases new report on "The Implications of AI for Criminal Justice"
The Council on Criminal Justice today released this notable report, put together following a summer convening of researchers and various stakeholders to discuss the potential and pitfalls of artificial intelligence for criminal justice systems. Here is part of the report's introduction:
The rapid advancement of artificial intelligence (AI) technologies has implications for every sector of society, including the criminal justice system. As AI tools for investigation, adjudication, prioritization, analysis, and decision-making proliferate and evolve, understanding their potential benefits and risks becomes increasingly important.
In June 2024, the Council on Criminal Justice (CCJ) convened a group of experts and stakeholders to discuss the implications of AI for the U.S. criminal justice system. The meeting brought together a diverse group of three dozen leading stakeholders from across ideologies, disciplines, and sectors of the system — policymakers, practitioners, researchers, technologists, and advocates — for two days of discussion and the examination of three use cases. The event was hosted by the Stanford Criminal Justice Center at the Stanford University School of Law....
The key goal of the convening was to jump-start a national conversation about how to integrate AI into criminal justice in ways that promote justice, efficiency, and effectiveness and avoid exacerbating existing problems or creating new ones. This report summarizes key themes from the convening.
October 8, 2024 in Procedure and Proof at Sentencing, Technocorrections | Permalink | Comments (0)
Friday, July 26, 2024
"Algorithms in Judges’ Hands: Incarceration and Inequity in Broward County, Florida"
The title of this post is the title of this article recently posted to SSRN authored by Utsav Bahl, Chad M. Topaz and others. Here is its abstract:
Judicial and carceral systems increasingly use criminal risk assessment algorithms to make decisions that affect individual freedoms. While the accuracy, fairness, and legality of these algorithms have come under scrutiny, their tangible impact on the American justice system remains almost completely unexplored. To fill this gap, we investigate the effect of the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) algorithm on judges’ decisions to mandate confinement as part of criminal sentences in Broward County, Florida.
Our study compiles a novel dataset of over ten thousand court records from periods before and after the implementation of COMPAS in Broward County and uses it to build a Directed Acyclic Graph (DAG) model of the confinement decision-making process. Our approach aims to reveal potential associations between the use of COMPAS and incarceration. We find that the many individuals deemed low risk by COMPAS are much less likely to be confined than were comparable individuals before COMPAS was in use, and similarly, individuals deemed high risk are much more likely to be confined than before. Overall, the impact of COMPAS scores on sentencing decisions is a reduced rate of confinement for both Black and white individuals. However, a racial bias exists within the COMPAS scores, as they are based on historical data that mirrors pre-existing racial inequities. While the overall rate of incarceration decreases, the difference in scores exacerbates the difference in confinement between racial groups, thereby deepening racial disparity. Insofar as criminal risk algorithms can aid decarceration, policymakers and judges alike should be mindful of the potential for increased racial inequity.
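Though the article's DAG-based methodology is more involved than the abstract conveys, the basic before/after comparison it describes can be illustrated with a minimal sketch. Everything in the snippet below (the file name, the column names, and the simple logistic-regression specification) is a hypothetical stand-in for the authors' actual data and model:

```python
# Minimal sketch (not the authors' DAG model): compare confinement rates for
# low- and high-risk defendants before and after COMPAS adoption.
# "broward_sentences.csv" and its columns (confined, post_compas, high_risk, race)
# are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("broward_sentences.csv")

# Descriptive check: confinement rate by period and COMPAS risk label.
print(df.groupby(["post_compas", "high_risk"])["confined"].mean())

# The interaction term asks whether the risk label shifts confinement odds
# differently once judges can see COMPAS scores.
model = smf.logit("confined ~ post_compas * high_risk + C(race)", data=df).fit()
print(model.summary())
```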
July 26, 2024 in Offender Characteristics, Procedure and Proof at Sentencing, Race, Class, and Gender, State Sentencing Guidelines, Technocorrections, Who Sentences | Permalink | Comments (1)
Monday, April 29, 2024
"Electronic Prison: A Just Path to Decarceration"
The title of this post is the title of this new paper authored by Paul Robinson and Jeffrey Seaman now available via SSRN. Here is its abstract:
The decarceration movement enjoys enthusiastic support from many academics and activists who point out imprisonment’s failure to rehabilitate and its potential criminogenic effects. At the same time, many fiscal conservatives and taxpayer groups are critical of imprisonment’s high costs and supportive of finding cheaper alternatives. Yet, despite this widespread support, the decarceration movement has made little real progress at getting offenders out of prison, in large part because community views, and thus political officials, are strongly committed to the importance of doing justice — giving offenders the punishment they deserve — and decarceration is commonly seen as inconsistent with that nonnegotiable principle. Indeed, almost no one in the decarceration movement has attempted to formulate a large-scale decarceration plan that still provides for what the community would see as just punishment.
In this Article, we offer just such a plan by demonstrating that it is entirely possible to avoid the incarceration of most offenders through utilizing non-incarcerative sanctions that can carry a total punitive effect comparable to physical prison. New technologies allow for imposing “electronic prison” sentences where authorities can monitor, control, and punish offenders in a cheaper and less damaging way than physical prison while still doing justice. Further, the monitoring conditions provided in electronic prison allow for the imposition of a wide array of other non-incarcerative sanctions that were previously difficult or impossible to enforce. Even while it justly punishes, electronic prison can dramatically increase an offender’s opportunities for training, treatment, education, and rehabilitation while avoiding the problems of unsupported families, socialization to criminality, and problematic reentry after physical incarceration. And, from a public safety standpoint, electronic prison can reduce recidivism by eliminating the criminogenic effect of incarceration and also provides longer-term monitoring of offenders than an equivalently punitive shorter term of physical imprisonment. Of course, one can imagine a variety of objections to an electronic prison system, ranging from claims it violates an offender’s rights to fears it may widen the net of carceral control. The Article provides a response to each.
Electronic prison is one of those rare policy proposals that should garner support from across the political spectrum due to effectively addressing the complaints against America’s incarceration system lodged by voices on the left, right, and center. Whether one’s primary concern is decarcerating prisoners and providing offenders with needed treatment, training, counseling, and education, or one’s concern is reducing crime, imposing deserved punishment, or simply reducing government expenditures, implementing an electronic prison system would provide a dramatic improvement over America’s current incarceration policies.
April 29, 2024 in Prisons and prisoners, Scope of Imprisonment, Technocorrections | Permalink | Comments (32)
Tuesday, January 30, 2024
Vera Institute produces big new report on "People on Electronic Monitoring"
The Vera Institute today released this lengthy new report, titled simply "People on Electronic Monitoring" and authored by Jess Zhang, Jacob Kang-Brown and Ari Kotler. Here is the "Summary" that begins this 54-page report:
Electronic monitoring (EM) is a form of digital surveillance that tracks people’s physical location, movement, or other markers of behavior (such as blood alcohol level). It is commonly used in the criminal legal system as a condition of pretrial release or post-conviction supervision — including during probation, parole, home confinement, or work release. The United States also uses electronic monitoring for people in civil immigration proceedings who are facing deportation.
This report fills a gap in understanding around the size and scope of EM use in the United States. The Vera Institute of Justice’s (Vera) estimates reveal that, in 2021, 254,700 adults were under some form of EM. Of these, 150,700 people were subjected to EM by the criminal legal system and 103,900 by U.S. Immigration and Customs Enforcement (ICE). Further investigation revealed that the number of adults placed on EM by ICE more than tripled between 2021 and 2022, increasing to 360,000. This means that the total number of adults on EM across both the civil immigration and criminal legal systems likely increased to nearly half a million during that time.
From 2005 to 2021, the number of people on EM in the United States grew nearly fivefold — and almost tenfold by 2022 — while the number of people incarcerated in jails and prisons declined by 16 percent and the number of people held in ICE civil detention increased but not nearly as dramatically as EM. Regional trends in the criminal legal system reveal how EM has been used more widely in some states and cities but increased sharply from 2019 to 2021 across the country: The Midwest has the highest rate of state and local criminal legal system EM, at 65 per 100,000 residents; this rate stayed relatively constant from 2019 to midyear 2021. In the Northeast, EM rates are the lowest of all the regions at 19 per 100,000 residents, but they increased by 46 percent from 2019 to 2021. The South and West have similar rates, 41 and 34 per 100,000 residents respectively, but the growth rate in the South has outpaced that of the West in recent years — up 32 percent in the South compared to 18 percent in the West.
Prior to this report, the most recent estimate of the national EM population was from a 2015 Pew Charitable Trusts study — which studied the use of criminal legal system EM via a survey of the 11 biggest EM companies. For this report, Vera researchers collected data from criminal legal system agencies in all 50 states and more than 500 counties, as well as from federal courts, the Federal Bureau of Prisons, and ICE. Therefore, Vera’s study represents the most comprehensive count of the national EM population to date, as it accounts for the rise of smaller EM companies, immigration system surveillance, and new EM technologies.
For this report, Vera researchers also reviewed existing literature and spoke with local officials to better understand the impacts of EM programs. Vera’s findings contradict private companies’ assertions that EM technology is low-cost, efficient, and reliable. EM in the criminal legal system is highly variable and subject to political decisions at the local level. In many jurisdictions, EM is not used as a means to reduce jail populations. Rather, it is often a crucial component of highly punitive criminal legal systems. This challenges the dominant narrative that EM is an “alternative to incarceration.” Nonetheless, this report also highlights several jurisdictions that demonstrate how decarceration can occur alongside reduced surveillance.
January 30, 2024 in Criminal Sentences Alternatives, Data on sentencing, Detailed sentencing data, Technocorrections | Permalink | Comments (1)
Thursday, January 25, 2024
"The Carceral Home"
The title of this post is the title of this recent paper just posted to SSRN and authored by Kate Weisburd. Here is its abstract:
In virtually all areas of law, the home is the ultimate constitutionally protected area, at least in theory. In practice, a range of modern institutions that target private life — from public housing to child welfare — have turned the home into a routinely surveilled space. Indeed, for the 4.5 million people on criminal court supervision, their home is their prison, or what I call a “carceral home.” Often in the name of decarceration, prison walls are replaced with restrictive rules that govern every aspect of private life and invasive surveillance technology that continuously records intimate information. While prisons have always been treated in the law as sites of punishment and diminished privacy, homes have not. Yet in the carceral home people have little privacy in the place where they presumptively should have the most. If progressive state interventions are to continue, some amount of home surveillance is surely inevitable. But these trends raise a critical, underexplored question: When the home is carceral, what is, or should be, left of the home as a protected area?
This Article addresses that question. Descriptively, it draws on a fifty-state analysis of court supervision rules to reveal the extent of targeted invasions of intimate life in the name of rehabilitation or an alternative to prison, rendering the home a highly surveilled space. Normatively, it argues that allowing this state of affairs with no corresponding adaptations in legal doctrine is untenable. With the home no longer sacred and no limiting principle to take its place, millions of people are left with no meaningful protection from government surveillance, even (or especially) in their home. Left unchecked, the carceral home further entrenches the precise racial, economic, disability, and gender inequities that often inspire reform efforts. Instead, as this Article recommends, privacy and security must be recognized as positive entitlements separate from physical homes.
January 25, 2024 in Procedure and Proof at Sentencing, Reentry and community supervision, Technocorrections | Permalink | Comments (8)
Friday, December 15, 2023
Fascinating new Slate series on "how technology is changing prison as we know it"
This week I came across this notable new series of articles at Slate titled "Time, Online: How technology is changing prison as we know it." The series as of this writing has nine extended pieces by an array of authors covering topics at the intersection of modern technology and modern prison. I have only started reading a few of the pieces, and they are all really interesting. This piece by Mia Armstrong-Lopez serves as an introduction under this full title: "For Years, Prison Life Was Isolated From Tech. Now Tech Is Beginning to Define It: Introducing Time, Online, a new Future Tense package that explores how technology is changing prison as we know it." Here is an excerpt from that piece (with links from the original):
Around 1.9 million people are currently incarcerated in the United States, and an estimated 45 percent of Americans have at some point experienced the incarceration of an immediate family member. (For Black Americans, that number increases to 63 percent.) For many years, prisons have largely been tech bunkers, keeping incarcerated people isolated from the world outside. But things have started to change. In some cases, they changed because prison leaders recognized the need to connect incarcerated people to their communities. In other cases, they changed because, in relationship with private companies, prisons found a way to profit.
The pandemic accelerated this trend. Activities like visitation and education programs were paused and in many cases replaced by video calls, e-messaging, and other virtual activities, facilitated in part by glitchy, specially made tablets distributed by private companies. As a result, things like tablets and e-messaging — which may seem trivial to those of us on the outside but can be transformational on the inside — are now popular in prisons and jails across the country. Instead of being isolated from technology, big parts of incarcerated people’s lives are now being mediated through it.
Time, Online, a new package from Future Tense, attempts to document this moment. Through a series of essays that were (with one exception) written by currently and formerly incarcerated people, we’ll examine how tech is changing what it means to be in prison. We’ll explore what happens when streaming comes to prison and how e-messaging is affecting romantic relationships. We’ll look at how incarcerated people are finding ways to jailbreak tablets and how you Google from the inside. We’ll look at what it’s like to wear an ankle monitor and to get access to a laptop after 17 years in prison. We’ll share diaries that document how incarcerated people interact with tech hour by hour, in three different state prison systems, and how much it costs. And we’ll dig into the who’s who of prison tech companies and the powerful financial forces behind them.
December 15, 2023 in Prisons and prisoners, Technocorrections | Permalink | Comments (0)
Friday, November 24, 2023
A mid-holiday weekend round-up of various sentencing opinions
I am taking much of the holiday weekend off from blogging (which may make many thankful). And that provides an excuse to keep up a bit by rounding up here some opinion pieces that have caught my eye recently:
From Forbes, "Bureau Of Prisons Backtracking On First Step Act Law"
From Governing, "No, Criminal Justice Reform Isn’t Driving Rising Crime"
From the New Republic, "The Turkey Pardon Is a Perfect Emblem of Our Very Dumb Politics"
From the New York Post, "Thank NY’s criminal justice ‘reforms’ for this double murder"
From Rolling Stone, "I’m Serving Life in Prison. It’s a Slow-Motion Death Sentence"
From the Times Union, "Parole reform saved New York money. Invest it in helping people stay out of the criminal justice system."
From the Wall Street Journal, "Tech Can Keep Ex-Offenders Out of Jail"
And a notable pair of commentaries from a notable pair of commentators at Reason:
From Josh Blackman, "A Reversal in Rahimi Will Be Tougher to Write Than Critics Admit"
From Will Baude, "It's Not So Hard to Write an Opinion Following Bruen and Reversing in Rahimi
November 24, 2023 in Prisons and prisoners, State Sentencing Guidelines, Technocorrections, Who Sentences | Permalink | Comments (0)
Sunday, November 05, 2023
"Justice by Algorithm: The Limits of AI in Criminal Sentencing"
The title of this post is the title of this new article authored by Isaac Taylor just published online for the journal Criminal Justice Ethics. Here is its abstract:
Criminal justice systems have traditionally relied heavily on human decision-making, but new technologies are increasingly supplementing the human role in this sector. This paper considers what general limits need to be placed on the use of algorithms in sentencing decisions. It argues that, even once we can build algorithms that equal human decision-making capacities, strict constraints need to be placed on how they are designed and developed. The act of condemnation is a valuable element of criminal sentencing, and using algorithms in sentencing — even in an advisory role — threatens to undermine this value. The paper argues that a principle of “meaningful public control” should be met in all sentencing decisions if they are to retain their condemnatory status. This principle requires that agents who have standing to act on behalf of the wider political community retain moral responsibility for all sentencing decisions. While this principle does not rule out the use of algorithms, it does require limits on how they are constructed.
November 5, 2023 in Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Technocorrections, Who Sentences | Permalink | Comments (0)
Tuesday, August 29, 2023
"Quantifying disparate questioning of Black and White jurors in capital jury selection"
The title of this post is the title of this article recently published in the Journal of Empirical Legal Studies and authored by Anna Effenberger, John Blume and Martin Wells. Here is its abstract:
This article presents findings from a quantitative study of jury selection using computational natural language processing methods. We analyzed the voir dire in a set of South Carolina capital trial cases used in previous studies to see if there was evidence of disparate questioning of potential jurors by the prosecution, defense counsel, or the trial judge. More specifically, we examined the descriptiveness and complexity of questioning. Our results, presented here, revealed significant, but sometimes subtle, disparate questioning of Black venire persons, especially by the prosecution.
The natural language processing software used in this study could provide attorneys challenging the use of peremptory challenges on appeal as being based on race or gender discrimination with evidence relevant to the issue of disparate questioning, which is often a pretext for purposeful discrimination. It could also potentially be used at trial since the analysis can be conducted almost instantaneously. Using it at either stage of the proceedings could be a powerful tool in achieving the goal of having more diverse juries in criminal cases, especially where the death penalty is a potential punishment.
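The abstract does not spell out the NLP pipeline, but the kind of "descriptiveness and complexity" comparison it describes could start from very simple per-question metrics. The snippet below is an illustrative sketch only; the transcript format and the two example questions are invented:

```python
# Minimal sketch (not the study's actual pipeline): compare the length and
# lexical richness of voir dire questions posed to Black and White venire
# members. The (race, question) pairs below are invented placeholders for
# transcript-derived data.
from collections import defaultdict
from statistics import mean

questions = [
    ("Black", "Do you believe in the death penalty?"),
    ("White", "Could you describe, in as much detail as you are comfortable with, "
              "your views on punishment and how you would weigh mitigating evidence?"),
    # ... transcript-derived pairs would go here
]

def word_count(text: str) -> int:
    return len(text.split())

def type_token_ratio(text: str) -> float:
    words = [w.lower().strip(".,?;:") for w in text.split()]
    return len(set(words)) / len(words) if words else 0.0

by_race = defaultdict(list)
for race, question in questions:
    by_race[race].append((word_count(question), type_token_ratio(question)))

for race, metrics in by_race.items():
    lengths, ttrs = zip(*metrics)
    print(f"{race}: mean words/question = {mean(lengths):.1f}, mean type-token ratio = {mean(ttrs):.2f}")
```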
August 29, 2023 in Procedure and Proof at Sentencing, Race, Class, and Gender, Technocorrections, Who Sentences | Permalink | Comments (2)
Sunday, August 13, 2023
"140 Characters of Justice? The Promise and Perils of Using Social Media to Reveal Lay Punishment Perspectives"
The title of this post is the title of this new paper now available via SSRN authored by Itay Ravid and Rotem Dror. Here is its abstract:
For centuries now, penal theorists have engaged in heated debates about two questions at the heart of criminal legal systems: how can we justify the State’s power to punish individuals, and how can we determine what is the proper level of punishment. To answer these questions, moral philosophers advanced conversations about several theories, which were predominantly held in philosophical silos. Over the years, calls to better understand community views regarding justifications of punishment and to adopt penal law and policies that align with these views gained traction, culminating in Paul Robinson’s “Empirical Desert” theory. Despite its intuitive appeal, there has been criticism of this approach, both questioning its core hypotheses and expressing concerns about its perceived immorality.
At the same time, advancement in social-science methodology provided research tools to empirically deepen our understanding of lay people’s attitudes regarding punishment, mostly through surveys and experimental research designs. One domain, however, remained untouched by those calling to assess lay intuitions of justice: social media. Such oversight is puzzling in light of social media’s potential to reveal public perceptions without scientific intervention.
This Article bridges this gap and engages with two main questions. First, a methodological question: whether social media discourse can be used to reflect laypeople’s attitudes about criminal culpability and punishment, and second, a normative question: should it be used for these purposes.
To answer these questions, the Article first explores current scholarship about the promises and challenges of using social media data to study social perceptions. The Article moves beyond theory, however, and utilizes recent technological developments in the field of Artificial Intelligence (“AI”) and Law and Natural Language Processing (“NLP”) to also explore empirically the potential promise of social media discourse in assessing community views on justice and punishment.
While the findings offer some support for the potentiality of using social media to assess laypeople’s attitudes regarding punishment, they also expose the complex challenges of utilizing such data, particularly for penal law and policy design. First, due to a host of methodological challenges, and second, due to normative challenges, particularly social media’s polarizing nature and the ambiguity around whose voice is amplified through these platforms.
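To make the empirical idea concrete, here is a rough sketch of the sort of off-the-shelf scoring a study like this might begin with: running an out-of-the-box sentiment model over posts about a sentencing outcome as a crude proxy for punitive versus lenient reactions. The example posts are invented, and this is not the Article's actual method:

```python
# Rough sketch (illustrative only, not the Article's method): score example
# posts about a sentencing outcome with NLTK's off-the-shelf VADER sentiment
# model as a crude proxy for punitive vs. lenient reactions. The posts are invented.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

posts = [
    "Ten years is nowhere near enough for what he did.",
    "Glad the judge showed some mercy; prison would have ruined her life.",
]

for post in posts:
    scores = sia.polarity_scores(post)  # dict with neg/neu/pos/compound scores
    print(f"{scores['compound']:+.2f}  {post}")
```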
August 13, 2023 in Purposes of Punishment and Sentencing, Technocorrections, Who Sentences | Permalink | Comments (0)
Wednesday, June 21, 2023
"Plea Bargaining in the Virtual Courtroom"
The title of this post is the title of this forthcoming book chapter authored by Thea Johnson now available via SSRN. Here is its abstract:
For many decades, courts across the United States have experimented with administering justice in virtual spaces. But the COVID-19 pandemic accelerated those experiments in some jurisdictions and forced other jurisdictions to adopt new technologies to process cases. While trials, particularly jury trials, were difficult to move into the online sphere, the plea process was easily transferable to the virtual world. Lawyers could negotiate cases via phone, text or video. Judges could see defendants from their homes, or via video from jail. Technologies like Zoom made it easy for the lawyers, clerks and even members of the public, to join the proceedings.
This chapter reflects on the intersection of these virtual experiments and the plea process. It argues that, in some ways, the move to virtual spaces has had very little impact at all on the process. The negotiation of a plea has always been virtual to some degree. Well before the pandemic, lawyers communicated across platforms, often with little input from either defendants or victims, to reach resolutions. To the extent that the loss of the criminal courtroom has had an impact on the interactions between lawyers, it is mostly as a result of them losing those informal spaces — courtroom hallways and back corners — where much of the real-time deal-making occurred. This chapter argues that the shift away from informality might have salutary effects, including increasing the likelihood of the parties creating a record of plea negotiations, and otherwise slowing down what is often a quick and frenzied process. More importantly, observing the virtual plea process helps us challenge assumptions about the importance of the physical space of the courtroom, and the interactions that happen within it. Indeed, an unintended benefit of moving to virtual proceedings has been to release defendants from the punishment of merely attending court. As Malcolm Feeley identified in his work nearly four decades ago, the process of being in the courtroom, separate from any of the formal outcomes associated with the criminal system, is itself a form of punishment. Without courtrooms, defendants are freed from the constraints of a physical space that has often been used to degrade them. As this chapter explores, in the case of plea bargaining, a virtual world may be a world in which defendants can claim more autonomy and respect during the plea process.
June 21, 2023 in Impact of the coronavirus on criminal justice, Procedure and Proof at Sentencing, Technocorrections, Who Sentences | Permalink | Comments (2)
Tuesday, March 28, 2023
Intriguing new PPI report on "unregulated growth of e‑messaging in prisons"
The Prison Policy Initiative has this new report authored by Mike Wessler on a particular prison technology, titled "SMH: The rapid & unregulated growth of e‑messaging in prisons." Here is how it begins (with links from the original):
Over the last twenty years, advocates and regulators have successfully lowered the prices of prison and jail phone rates. While these victories garnered headlines and attention, the companies behind these services quietly regrouped and refocused their efforts. Seeking different ways to protect their profits, they entered less-regulated industries and offered new products to people behind bars. One new service in particular — text-based electronic messaging or “e-messaging” — has experienced explosive and unregulated growth. As a result, rather than living up to its potential as a way to maintain connections between people in prison and the outside world — something that benefits all of us — high costs and shoddy technology have made e-messaging little more than the latest way these companies drain money from incarcerated people and their loved ones.
In 2016, we released a groundbreaking report that took a first look at e-messaging, sometimes — but incorrectly — called “email.” At that time, the technology was experimental, untested, and viewed skeptically by many correctional administrators. Since then, though, it has become common inside prison walls.
To better understand this explosive growth in e-messaging, we examined all 50 state prison systems, as well as the Federal Bureau of Prisons (BOP), to see how common this technology has become, how much it costs, and what, if anything, is being done to protect incarcerated people and their families from exploitation. We found an industry that is in flux, expanding quickly, and has yet to face the legislative and regulatory oversight it desperately needs.
March 28, 2023 in Prisons and prisoners, Technocorrections, Who Sentences | Permalink | Comments (0)
Tuesday, January 24, 2023
"Glass Box Artificial Intelligence in Criminal Justice"
The title of this post is the title of this notable new paper available via SSRN authored by Brandon Garrett and Cynthia Rudin. Here is its abstract:
As we embrace data-driven technologies across a wide range of human activities, policymakers and researchers increasingly sound alarms regarding the dangers posed by “black box” uses of artificial intelligence (AI) to society, democracy, and individual rights. Such models are either too complex for people to understand or they are designed so that their functioning is inaccessible. This lack of transparency can have harmful consequences for the people affected. One central area of concern has been the criminal justice system, in which life, liberty, and public safety can be at stake. Judges have struggled with government claims that AI, such as that used in DNA mixture interpretation, risk assessments, facial recognition, and predictive policing, should remain a black box that is not disclosed to the defense and in court. Both the champions and critics of AI have argued we face a central trade-off: black box AI sacrifices interpretability for predictive accuracy.
We write to counter this black box myth. We describe a body of computer science research showing “glass box” AI that is interpretable can be more accurate. Indeed, criminal justice data is notoriously error prone, and unless AI is interpretable, those errors can have grave hidden consequences. Our intervention has implications for constitutional criminal procedure rights. Judges have been reluctant to impair perceived effectiveness of black box AI by insisting on the disclosures defendants should be constitutionally entitled to receive. Given the criminal procedure rights and public safety interests at stake, it is especially important that people can understand AI. More fundamentally, we argue that there is no necessary tradeoff between the benefits of AI and the vindication of constitutional rights. Indeed, glass box AI can better accomplish both fairness and public safety goals.
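The "no necessary tradeoff" claim is easy to test in miniature: fit a transparent model and a black-box model on the same data and compare held-out performance. In the sketch below, the data file and feature names are hypothetical placeholders:

```python
# Minimal sketch of the glass box vs. black box comparison: benchmark a
# transparent model against a harder-to-interpret one on the same data.
# "pretrial_outcomes.csv" and its column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("pretrial_outcomes.csv")
X = df[["age", "prior_arrests", "charge_severity"]]
y = df["rearrested"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

glass_box = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # coefficients are inspectable
black_box = GradientBoostingClassifier().fit(X_tr, y_tr)       # internals are opaque

for name, model in [("glass box", glass_box), ("black box", black_box)]:
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: held-out AUC = {auc:.3f}")

# The interpretable model's reasoning can be audited directly:
print(dict(zip(X.columns, glass_box.coef_[0])))
```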
January 24, 2023 in Procedure and Proof at Sentencing, Technocorrections | Permalink | Comments (0)
Thursday, December 01, 2022
"The Progressive Case for Ankle Bracelets"
The title of this post is the headline of this notable new Newsweek opinion piece authored by Barry Latzer. I recommend the piece in full, and here are excerpts:
Many of the most progressive countries in the world are making use of technology to promote rehabilitation and reduce incarceration. Yet blue states like Massachusetts and left-leaning advocacy organizations remain hostile to use of electronic monitoring (EM) methods. They are overlooking the benefits of EM — even from a progressive standpoint.
Progressives' typically formulated criminal justice goal is laudatory: to minimize incarceration consistent with public safety, and to maximize the rehabilitation of offenders. But achieving these ends has been, to say the least, problematic.
Progressives commonly urge more addiction treatment and mental health services as steps toward rehabilitation. These treatments might be beneficial for many ex-offenders, but by themselves they are unlikely to sharply curtail recidivism. Vocational training is also useful, but success measured by societal reintegration of ex-offenders is unproven. Despite all that we've learned about rehabilitation over the last five decades, the inescapable fact is that over 80% of all prisoners are rearrested for new crimes at some point after they are released.
Virtually all prisoners return to free society — and more quickly than most people realize. Only 20% of prisoners complete full sentences, and the median time actually served is a mere one year and four months. Released offenders are then monitored by parole or probation officers, who are supposed to encourage constructive behavior. But as we all know, these officers have enormous caseloads and cannot effectively supervise the volume of people they are assigned. Under the current parole and probation system, there are, as a practical matter, few disincentives to crime, which is why so many released offenders are repeaters.
Each year, tens of thousands of probationers and parolees fail to comply with the terms of their release and are sent back to jail or prison. In 2019, before COVID produced its own distinct brand of decarceration, 334,000 probation and parole failures were (re)incarcerated, which constituted 29% of all prison admissions that year.
Given these discouraging realities, the benefits of EM from a progressive standpoint surely are worth reconsidering.
1. EM helps ex-offenders avoid incarceration and reintegrate into free society....
2. EM can effectively replace incarceration....
3. EM protects crime victims, especially the most vulnerable....
Before we reject this useful tool to help ex-offenders turn their lives around and avoid wasted years behind bars, we should ask ourselves this question: Is there a better way to achieve reintegration into law-abiding society, while also taking public protection into account? I submit there is not.
December 1, 2022 in Criminal Sentences Alternatives, Reentry and community supervision, Technocorrections | Permalink | Comments (1)
Tuesday, March 15, 2022
Spotlighting the new widening potential of electronic monitoring
This new Los Angeles Times op-ed authored by Kate Weisburd and Alicia Virani and headlined "The monster of incarceration quietly expands through ankle monitors," spotlights why many are concerned that electronic monitoring and other new supervision tools may expand rather than reduce our nation's carceral footprint. I recommend the full piece, and here are excerpts (with links from the original):
In Los Angeles County, the number of people ordered to wear electronic ankle monitors as a condition of pretrial release went up 5,250% in the last six years, according to a recent report by the UCLA Criminal Justice Program. The figure rose from just 24 individuals in 2015 to more than 1,200 in 2021. This type of carceral surveillance is becoming the “new normal” across the U.S....
It’s widely defended as “better than jail,” but being “better than jail” does not make a criminal justice policy sound — much less humane or legal.... It’s deceptive to even compare jail and ankle monitors as though they are the only two options. There is a third option: freedom. In 2015 and before, L.A. judges were unlikely to order electronic monitoring as a condition of release before trial. Judges either set bail, released people on their own recognizance or ordered that people be detained in jail until trial.
Now, judges seem to be defaulting to electronic monitoring, perhaps for people who would — or should — otherwise be free. For people who would otherwise be in jail, monitoring may be preferable. But for people who are monitored instead of being released on their own recognizance, monitoring reflects a dangerous expansion of the carceral state.
This “E-jail” entails a web of invasive rules and surveillance technologies, such as GPS-equipped ankle monitors, that allow law enforcement to tag, track and analyze the precise locations of people who have not been convicted of any crime.... The difference between E-jails and real jails is a matter of degree, not of kind. A recent report by researchers at George Washington University School of Law details the myriad ways that monitoring undermines autonomy, dignity, privacy, financial security and social relationships when they are needed most.
Like in jail, people on monitors lose their liberty. In L.A., as elsewhere, people on monitors are forbidden from leaving their house without pre-approval from authorities days in advance. Like in jail, people on monitors have little privacy and must comply with dozens of strict rules governing every aspect of daily life. Failing to charge the monitoring device, changing a work or school schedule without permission or making an unauthorized trip to the grocery store can land someone back in jail for a technical violation. It is hardly surprising that in L.A. County, technical rule violations, not new criminal offenses, led to more than 90% of the terminations and reincarcerations applied to people on electronic monitors.
Ankle monitoring also further entrenches the very racial and economic inequities that bail reform sought to address. In 2021, 84% of people on pretrial electronic monitoring in L.A. County were either Black or Latinx. And in most places, though not in L.A., people on ankle monitors before trial are required to pay for the device. These fees are on top of other costs, such as electric bills (to charge the monitor), cellphone bills (to communicate with the monitoring agency) and the cost of care and transportation for family members that is required because people on monitors often cannot leave home....
There is, however, some reason for optimism. After years of community organizing, L.A. County’s Board of Supervisors recently passed a motion to develop an independent pretrial services agency within a new Justice, Care and Opportunities Department that takes over the role that probation plays in pretrial services. This new agency has the ability and authority to end the county’s needless reliance on electronic monitoring. We urge officials to focus on innovative solutions that rely on community-based support rather than punitive and harmful surveillance technology.
March 15, 2022 in Criminal Sentences Alternatives, Reentry and community supervision, Technocorrections, Who Sentences | Permalink | Comments (8)
Thursday, September 23, 2021
Notable new report spotlights onerous nature of electronic monitoring in US
This new NBC News piece, headlined "Other than prison, electronic monitoring is 'the most restrictive form' of control, research finds," reports on this interesting new report from folks at George Washington University Law School, titled "Electronic Prisons: The Operation of Ankle Monitoring in the Criminal Legal System." Here are excerpts from the press piece:
In the past 18 months, as the judicial system has increasingly used electronic monitoring instead of prisons to monitor inmates through the coronavirus pandemic, newly released data confirm what activists and advocates have long argued: Ankle monitors are onerous, and they often subject wearers to vague rules, like avoiding people of “disreputable character.” The ankle monitoring business, the research found, is also dominated by four profit-seeking companies, and it ultimately could drive more people back to prison.
The new, comprehensive collection of hundreds of electronic monitoring-related rules, policies and contracts, obtained through public records requests across 44 states, demonstrates that four companies that make millions of dollars a year account for 64 percent of the contracts examined in the study. The companies — Attenti, BI Inc., Satellite Tracking of People LLC and Sentinel Offender Services LLC, according to the report — also keep location data indefinitely, even after monitoring is completed, which is within the law. Governments also often require family members or employers to act as agents of the government and report potential violations, putting them in an awkward position in which they must be both supportive and supervisory.
Crucially, wearers must pay both one-time and ongoing fees for the monitors, which can be $25 to over $8,000 a year. The report argues that such costs “undermine financial security when it is needed most.” By comparison, the Justice Department’s Bureau of Prisons said in 2018 that it costs just under $100 per day to incarcerate a federal inmate, or over $36,000 a year....
“This is a form of incarceration that happens outside of prison walls,” said Kate Weisburd, an associate professor of law at George Washington University, who led a team of 10 law students that filed and analyzed the trove of documents. “It’s always intended to be a positive alternative to incarceration. But based on what we found, it’s doing the opposite. More rules and more surveillance generally leads to higher incarceration.”...
Put another way, people on monitors are subject to a vast number of government rules, which “makes compliance difficult,” according to the report. Some of the rules are quite vague. For example, the Alabama Bureau of Pardons and Parole mandates that wearers “shall abandon evil associates and ways,” while the New Mexico Corrections Department says parolees must “maintain acceptable behavior.”...
Weisburd’s research found that because the results are open to interpretation and wearers can be hit with “technical violations” of the rules, “people are more likely to be reincarcerated for minor infractions that previously would have been invisible and ignored.” In most cases, electronic monitoring is coupled with a form of house arrest — wearers must stay at or near their homes for a certain amount of time. They cannot leave without permission in advance. But according to the policies and contracts that Weisburd and her team obtained, most agencies do not clearly explain how far in advance such permission must be sought. “Basically, every record we looked at had a negative impact, and by every measure it undermines people’s ability to survive outside of prison,” she said. “Just having to comply with the sheer number of rules, vague and broad rules, it means people are getting dinged more easily.”...
The most recent data from the Pew Charitable Trust, released in 2016, found that about 131,000 people were on monitors during a single day. Weisburd and her team say in the report that “it is likely that the numbers are higher considering the pressure to release people from incarceration because of the pandemic.”... The frequency with which such monitoring is assigned varies wildly across the country. For example, Weisburd’s research shows that over 11,000 people who are on probation are also on monitors in Marion County, Indiana, alone, while the entire state of Florida has less than half that number, at just over 5,400.
Here is the introduction of the 54-page report:
The use of surveillance technology to tag and track people on pretrial release, probation and parole is on the rise. The COVID-19 crisis in prisons and jails, bail reform efforts and bipartisan support for curbing mass incarceration accelerated interest in purported alternatives to incarceration. As a result, the use of electronic monitoring devices, including GPS-equipped ankle monitors, went up dramatically.
Thanks to the leadership of community organizers and advocates, the harmful and racialized nature of this type of carceral surveillance has been exposed. This report seeks to add to those efforts by examining the specific policies, procedures, contracts and rules that govern the use of electronic monitoring of people on probation, parole and pretrial release. Drawing on over 247 records from 101 agencies across 44 states and the District of Columbia, this report focuses on the operation of electronic monitoring and reveals the degree to which monitoring impacts all aspects of everyday life and undermines the ability of people to survive and thrive. In particular, this report focuses on the specific rules and policies governing people on monitors and how they restrict movement, limit privacy, undermine family and social relationships, jeopardize financial security and result in repeated loss of freedom. Unlike traditional models of probation and parole, electronic surveillance is more intensive, restrictive and dependent on private surveillance companies that are driven by profit motive. The findings in this report demonstrate what advocates have long said: Electronic surveillance is not an alternative to incarceration, it’s an alternative form of incarceration. And like incarceration, the deprivations and restrictions of electronic monitoring further entrench race and class-based subordination.
September 23, 2021 in Criminal Sentences Alternatives, Data on sentencing, Race, Class, and Gender, Reentry and community supervision, Technocorrections | Permalink | Comments (1)
Tuesday, September 14, 2021
NACDL produces notable new report on data-driven policing and racial bias in criminal justice system
As detailed in this new press release, "today, the National Association of Criminal Defense Lawyers (NACDL) released its latest report – Garbage In, Gospel Out: How Data-Driven Policing Technologies Entrench Historic Racism and ‘Tech-wash’ Bias in the Criminal Legal System." Here is more from the release:
As explained in the report, in recent years, police departments have been turning to and relying on rapidly developing data-driven policing technologies to surveil communities, track individuals and, purportedly, predict crime. These technologies include algorithmic decision-making that departments claim can predict where crime is likely to occur, who will likely commit crime, and who will likely be a victim. These algorithms are thus designed to interrogate massive troves of data gathered in a myriad of ways, using inputs that can range from police-generated crime reports to publicly available social media posts. The outputs are then used to make critical decisions about patrols, or to make life-altering designations of individuals.
The purpose of this Report is to: (1) call attention to the rapid development and deployment of data-driven policing; (2) situate data-driven policing within the racialized historical context of policing and the criminal legal system; (3) make actionable recommendations that respond to the reality, enormity, and impact of data-driven policing; and (4) suggest strategies for defense lawyers in places where data-driven policing technology is employed.
“This Report will contribute profoundly to the national conversation regarding the inhumane, unfair, and destructive impact of racism and bias in policing,” said NACDL President Martín Antonio Sabelli. “As the title of the Report suggests, data-driven policing technologies amplify the effects of systemic racism in policing by collecting data based on racist policing (including, for example, overpolicing of communities of color) and treating that garbage data as gospel for future policing decisions. ‘White-washing’ this biased data does nothing more than give a veneer of respectability and an appearance of neutrality while entrenching problematic practices rooted in racism. The report calls for the abandonment of data-driven policing, wherever possible, and transparency and accountability where such practices have already become entrenched.”
“For more than two years, NACDL’s Task Force on Predictive Policing conducted research and interviews across the nation, leading to this report and the recommendations and suggested strategies set forth in it,” explained NACDL Task Force on Predictive Policing Chair Cynthia W. Roseberry. “This report works to demystify the practice of data-driven policing to ensure that those engaged in the essential work of combatting systemic racism in the criminal legal system can operate with full information. This report is not only an important addition to the body of scholarship in this area, it will also serve as a vital tool for advocates and defenders alike.”
The Report’s major topics include (1) the history of policing and the economics of punishment, (2) the history of surveillance and the rise of big data, (3) the landscape of data-driven policing, (4) critical analysis of data-driven policing, (5) task force recommendations on data-driven policing technologies, (6) an overview of state and local legislation, (7) an overview of police departments that have suspended or terminated contracts with data-driven policing programs, and more.
The full 100+-page report is available at this link.
September 14, 2021 in Procedure and Proof at Sentencing, Race, Class, and Gender, Technocorrections, Who Sentences | Permalink | Comments (0)
Tuesday, July 27, 2021
"Just Algorithms: Using Science to Reduce Incarceration and Inform a Jurisprudence of Risk"
The title of this post is the title of this notable new forthcoming book authored by Christopher Slobogin. It is also the title of this new SSRN posting which provides a preview of the book and an article that sets forth some of its contents. Here is the SSRN posting abstract:
Statistically-derived algorithms, adopted by many jurisdictions in an effort to identify the risk of reoffending posed by criminal defendants, have been lambasted as racist, de-humanizing, and antithetical to the foundational tenets of criminal justice. Just Algorithms argues that these attacks are misguided and that, properly regulated, risk assessment tools can be a crucial means of safely and humanely dismantling our massive jail and prison complex.
The book explains how risk algorithms work, the types of legal questions they should answer, and the criteria for judging whether they do so in a way that minimizes bias and respects human dignity. It also shows how risk assessment instruments can provide leverage for curtailing draconian prison sentences and the plea-bargaining system that produces them. The ultimate goal of the book is to develop the principles that should govern, in both the pretrial and sentencing settings, the criminal justice system's consideration of risk. Table of Contents and Preface are provided, as well as a recent article that tracks closely two of the book's chapters.
July 27, 2021 in Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Recommended reading, Technocorrections, Who Sentences | Permalink | Comments (0)
Monday, July 19, 2021
NIJ releases new publication with "Guidelines for Post-Sentencing Risk Assessment"
Via this webpage, headed "Redesigning Risk and Need Assessment in Corrections," the National Institute of Justice discusses its notable new publication titled "Guidelines for Post-Sentencing Risk Assessment." Here is how the webpage sets up the full publication:
Over the past several decades, the use of risk and needs assessment (RNA) in correctional systems has proliferated. Indeed, the vast majority of local, state, and federal correctional systems in the United States now use some type of RNA. Despite the numerous ways in which RNA instruments can improve correctional policy and practice, the kind of RNA currently used across much of the country has yet to live up to this promise because it is outdated, inefficient, and less effective than it should be.
In an effort to help the corrections field realize the full potential of RNA instruments, NIJ recently released Guidelines for Post-Sentencing Risk Assessment. These guidelines, assembled by a trio of corrections researchers and practitioners, are built around four fundamental principles for the responsible and ethical use of RNAs: fairness, efficiency, effectiveness, and communication. Each of these principles contributes to an innovative, practical checklist of steps practitioners can use to maximize the reliability and validity of RNA instruments.
Here is part of the executive summary from the full report:
Risk and needs assessment (RNA) tools are used within corrections to prospectively identify those who have a greater risk of offending, violating laws or rules of prison or jail, and/ or violating the conditions of community supervision. Correctional authorities use RNA instruments to guide a host of decisions that are, to a large extent, intended to enhance public safety and make better use of scarce resources. Despite the numerous ways in which RNA instruments can improve correctional policy and practice, the style and type of RNA currently used by much of the field has yet to live up to this promise because it is outdated, inefficient, and less effective than it should be.
In an effort to help the corrections field realize the potential that RNA instruments have for improving decision-making and reducing recidivism, we have drawn upon our collective wisdom and experience to identify four principles that are critical to the responsible and ethical use of RNAs. Within each principle is a set of guidelines that, when applied in practice, would help maximize the reliability and validity of RNA instruments. Because these guidelines comprise novel, evidence-based practices and procedures, the recommendations we propose in this paper are relatively innovative, at least for the field of corrections.
■ The first principle, fairness, holds that RNA tools should be used to yield more equitable outcomes. When assessments are designed, efforts should be taken to eliminate or minimize potential sources of bias, which will mitigate racial and ethnic disparities. Preprocessing, in-processing, and post-processing adjustments are design strategies that can help minimize bias. Disparities can also be reduced through the way in which practitioners use RNAs, such as delivering more programming resources to those who need it the most (the risk principle). Collectively, this provides correctional agencies with a strategy for achieving better and more equitable outcomes.
■ The second principle, efficiency, indicates that RNA instruments should rely on processes that promote reliability, expand assessment capacity, and do not burden staff resources. The vast majority of RNAs rely on time-consuming, cumbersome processes that mimic paper and pencil instruments; that is, they are forms to be completed and then manually scored by staff. The efficiency of RNA tools can be improved by adopting automated and computer-assisted scoring processes to increase reliability, validity, and assessment capacity. If RNA tools must be scored manually, then inter-rater reliability assessments must be carried out to ensure adequate consistency in scoring among staff.
■ RNA instruments should not only be fair and efficient, but they should also be effective, which is the third key principle. The degree to which RNA instruments are effective depends largely on their predictive validity and how the tool is used within an agency. Machine learning algorithms often help increase predictive accuracy, although developers should test multiple algorithms to determine which one performs the best. RNA tools that are customized to the correctional population on which they are used will deliver better predictive performance.
■ Finally, it is important to focus on the implementation and use of RNAs so that individuals can become increasingly aware of their risk factors. To this end, the fourth key principle is to employ strategies that improve risk communication. Training the correctional staff who will be using the RNA tool is essential for effective communication, particularly in how to explain the needs and translate it into a case plan. A risk communication system, which includes case plan improvement, treatment-matching algorithms, and graduated sanctions and incentives, provides an integrated model for decision-making that helps increase an individual’s awareness of their own circumstances and need for programming.
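As a small illustration of the efficiency principle's call for inter-rater reliability checks when RNA tools are scored manually, one conventional approach is Cohen's kappa on the risk levels two raters assign to the same cases. The rater data below is invented:

```python
# Small illustration of an inter-rater reliability check for manually scored
# RNA tools: Cohen's kappa on the risk levels two raters assign to the same
# ten cases. The ratings below are invented.
from sklearn.metrics import cohen_kappa_score

rater_a = ["low", "low", "medium", "high", "medium", "low", "high", "medium", "low", "high"]
rater_b = ["low", "medium", "medium", "high", "medium", "low", "high", "high", "low", "high"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa between raters: {kappa:.2f}")  # about 0.70 for these ratings
```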
July 19, 2021 in Prisons and prisoners, Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Reentry and community supervision, Technocorrections, Who Sentences | Permalink | Comments (0)
Monday, July 05, 2021
Asking hard (but incomplete) questions about electronic monitoring as an alternative to prison
I am glad to see that NBC News has this lengthy new article about electronic monitoring under the headline "Incarcerated at home: The rise of ankle monitors and house arrest during the pandemic." Unfortunately, the piece only scratches the surface concerning how the pandemic may have enduringly altered sentencing practices and the pros and cons of greater reliance on home confinement with electronic monitoring. I still recommend this piece, but I hope future coverage will gather more data and dig even deeper into pandemic-era experiences on this important topic. In the meantime, here are excerpts from this lengthy NBC News piece:
During the pandemic, as jails raced to release incarcerated people because prisons became coronavirus hot spots, many judges nationwide responded by putting those who were being released in electronic ankle monitors that tracked their movements 24 hours a day. Other people were assigned ankle monitors as an alternative to bail as they awaited trial in a backlogged court system that moved online.
Now, early data shows how much the use of electronic ankle monitoring rose nationwide during that time, according to research from Kate Weisburd, a law professor at George Washington University and a former juvenile defender. Researchers are finding that ankle monitors are keeping people connected to the prison system longer than ever, as more remain strapped to the devices for over a year.
“Everyone is looking for ways of getting people out of custody, which obviously is a good thing,” Weisburd said. “But what's happening in some jurisdictions in the adult system is that more and more people are being released on monitors as a response to decarceration.”
In Chicago, the Cook County Sheriff’s Office’s use of ankle monitors for adults who are awaiting trial jumped from 2,600 people in April last year to over 3,500 in December, according to data from the Chicago Appleseed Center for Fair Courts, a research and civil liberties group that advocates to improve court processes and find alternatives to incarceration. Chief Adriana Morales of the sheriff’s office said in a statement that electronic monitoring is always court-ordered and confirmed that during Covid-19 there’s been a “dramatic increase” in orders for them.
Law enforcement departments that use electronic monitoring say the devices are supposed to serve as an alternative to incarceration and help people remain in their community rather than serving time in jail. But interviews with people who have been incarcerated and then placed on ankle monitors and researchers who study recidivism say the surveillance devices hurt people trying to get their life on track after prison and that there’s no evidence the technology is rehabilitative. They often drag adults and youth even deeper into the criminal justice system and sometimes back behind bars....
Law enforcement experts find that ankle monitors seem to work best for a targeted population, like adults who are found to be at high risk to reoffend, said Kelly Mitchell, executive director of the Robina Institute of Criminal Law and Criminal Justice at the University of Minnesota. “But for your average drug and property offenses, it’s not a good use at all.”
Mitchell said electronic monitoring can be helpful from a probation officer’s perspective when keeping track of individuals who have committed more serious offenses or violent crimes but would still benefit from being taken out of the jail system. “Electronic monitoring can provide a little bit of extra something to monitor that person for a period of time if we decide that we’re ready to give them a chance in the community,” Mitchell said.
Ankle monitors were first developed by social psychologists in the 1960s in an effort to offer positive reinforcement to juvenile offenders. They came into use by the justice system in the 1980s and early 1990s. While they still offer the upside of an alternative to prison or jail, they have in recent years become the focus of growing skepticism — particularly as their use has widened. Advocates for criminal justice reform say that while ankle monitors may appear preferable for people who hope to get out of jail sooner, they don’t address systemic issues that land so many people behind bars.
“We're not putting resources into their communities to address the issues of violence, to address the issues of unemployment and poverty and structural racism,” said James Kilgore, an author and activist with the Challenging E-Carceration project at the Center for Media Justice. “Instead we’re going to slap this thing on them so we can track them, and we can keep them locked up in their house.”...
Though electronic monitoring is cheaper for municipalities and states than jail, the cost of the surveillance device is often passed on to the people wearing them. And during the pandemic, when millions of people lost their jobs and unemployment benefits were backlogged, that cost added up. In at least 30 states, agencies require those who are placed in an electronic monitor to pay between $2 and $20 a day to wear one, not including activation fees that some counties tack on, according to Weisburd’s research....
While the cost of incarceration is higher than the cost of an ankle monitor, and house arrest is for many a better option than jail, in places like Chicago the majority of people on electronic monitoring are awaiting trial and have yet to be convicted. But unlike other jurisdictions, Cook County does not charge people for the monitors....
Like so many electronics, ankle monitors also don’t always work. When the electronic monitor registers a violation, whether because the device was not charged on schedule or because the wearer stepped outside the house at the wrong time, the company running the monitor notifies law enforcement. Then officers may be sent to the wearer’s home or work.
With the dramatic increase of people on ankle monitors during the pandemic in Chicago, local watchdogs say they’re seeing a rise in violations for small infractions. Matthew McLoughlin, an organizer with the Illinois Network for Pretrial Justice, said he’s also seen an increase in false violations and technical glitches for people whose ankle monitors rely on GPS tracking.
July 5, 2021 in Criminal Sentences Alternatives, Technocorrections | Permalink | Comments (1)
Saturday, June 12, 2021
"Progressive Algorithms"
The title of this post is the title of this notable new paper authored by Itay Ravid and Amit Haim available via SSRN. Here is its abstract:
Our criminal justice system is broken. Problems of mass incarceration, racial disparities, and susceptibility to error are prevalent in all phases of the criminal process. Recently, two dominant trends that aspire to tackle these fundamental problems have emerged in the criminal justice system: progressive prosecution — often defined as elected reform-minded prosecutors that advance systemic change in criminal justice — and algorithmic decision-making — characterized by the adoption of statistical modeling and computational methodology to predict outcomes in criminal contexts.
While there are growing bodies of literature on each of these two trends, thus far they have not been discussed in tandem. This Article argues that scholarship on criminal justice reform must consider both developments and strive to reconcile them. We argue that while both trends promise to address similar key flaws in the criminal justice system, they send diametrically opposed messages with respect to the role of humans in advancing criminal justice reform. Progressive prosecution posits that humans are the solution, while algorithmic tools suggest humans are the problem. This clash reflects both normative frictions and deep differences in the modus operandi of each of these paradigms. Such tensions are not only theoretical but have immediate practical implications such that each approach tends to inhibit the advantages of the other with respect to bettering the criminal justice system.
We argue against disjointly embedding progressive agendas and algorithmic tools in criminal justice systems. Instead, we offer a decision-making model which prioritizes principles of accountability, transparency, and democratization without neglecting the benefits of computational methods and technology. Overall, this article offers a framework to start thinking through the inherent frictions between progressive prosecution and algorithmic decision-making and the potential ways to overcome them. More broadly, the Article contributes to the discussions about the role of humans in advancing legal reforms in an era of pervading technology.
June 12, 2021 in Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Technocorrections, Who Sentences | Permalink | Comments (0)
Sunday, May 16, 2021
"Bars Behind Bars: Digital Technology in the Prison System"
The title of this post is the title of this notable new paper now available via SSRN authored by Paolo Arguelles and Isabelle Ortiz-Luis. Here is its abstract:
With little opportunity to engage with technology while behind bars, returning citizens are finding themselves on the far side of the digital divide and increasingly vulnerable to recidivism. Investing in a well-run digital literacy program for our prison system is an innovative solution to a persistent problem and a rare win-win situation for inmates, prison officials, and American taxpayers.
We begin by discussing how inmate tablet distribution programs mutually benefit both inmates and prison officials. We then explore prison profiteering by technology companies as a potential obstacle to the successful administration of technology programs, discussing the emergence of virtual monopolies in the prison technology space, their history of controversial pricing practices, and how these practices are perpetuated through prison tablet programs. We then present novel insights into how competitive bidding can be used as a public policy instrument to regulate competition, specifically in the context of prison technology. We argue that a traditional bidding framework is insufficient to act as a policy instrument and propose an alternative incentive-based framework toward this end. We conclude by outlining several practical recommendations that prison officials should consider when administering digital literacy programs in their facilities.
May 16, 2021 in Prisons and prisoners, Technocorrections, Who Sentences | Permalink | Comments (1)
Thursday, March 25, 2021
"Punitive Surveillance"
The title of this post is the title of this notable new paper by Kate Weisburd now available via SSRN. Here is its abstract:
Is there a “punishment exception” to the Constitution? That is, can the deprivation of fundamental rights — such as the right to protest, to visit a mosque, or consult a lawyer — be imposed as direct punishment for a crime, so long as such intrusions are not “cruel and unusual” (under the Eighth Amendment)? On the one hand, such intrusions seem clearly unconstitutional unless narrowly tailored to meet a compelling state interest; on the other hand, they seem less harsh than prison. Surprisingly, the answer is not obvious. But the answer is critical as courts increasingly impose new forms of non-carceral punishment, such as GPS-equipped ankle monitors, smart phone tracking, and suspicionless searches of electronic devices. This type of monitoring, what I term “punitive surveillance,” allows government officials and for-profit companies to track, record, search and analyze the location, biometric data and other meta-data of thousands of people on probation and parole. With virtually no oversight or restraint, punitive surveillance strips people of fundamental rights, including privacy, speech, and liberty. Thus far, courts have assumed that such intrusions are merely “conditions” of punishment or “regulatory” measures. As a result, punitive surveillance is subject to almost no limitations.
This Article is the first to argue that these restrictive and invasive surveillance measures are — just like a prison sentence — punishment, and subject to constitutional limits. The Article makes three contributions. First, drawing on original empirical research of almost 300 state and local policies, it reveals the punitive and rights-stripping nature of electronic surveillance of those on court supervision. Second, it explains why courts’ labeling of such surveillance as a “condition” of punishment or a regulatory measure stems from a misunderstanding of this surveillance and the law of punishment. Finally, it makes the case that punishment is still subject to constitutional limits beyond the Eighth Amendment and the Ex Post Facto Clause, as well as other limits. Given the rights at stake, and that punitive surveillance entrenches race and class-based subordination, limiting punitive surveillance is crucial.
March 25, 2021 in Criminal Sentences Alternatives, Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Reentry and community supervision, Technocorrections | Permalink | Comments (0)
Wednesday, March 10, 2021
"The Intersection of Race and Algorithmic Tools in the Criminal Legal System"
The title of this post is the title of this new article authored by Vincent Southerland now available via SSRN. Here is its abstract:
A growing portion of the American public — including policymakers, advocates, and institutional stakeholders — have generally come to accept the fact that racism endemic to the United States infects every stage of the criminal legal system. Acceptance of that fact has led to efforts to address and remedy pervasive and readily observable systemic bias. Chief among those efforts is a turn toward technology — specifically algorithmic decision-making and actuarial tools. Many have welcomed the embrace of technology, confident that technological tools can solve a problem — race-based inequity — that has bedeviled humans for generations.
This article engages that embrace by probing the adoption of technological tools at various sites throughout the criminal legal system and exploring their efficacy as a remedy to racial inequality. Then, by applying a racial justice lens, this article develops and offers a set of prescriptions designed to address the design, implementation, and oversight of algorithmic tools in spaces where the promise offered by technological tools has not been met. Adherence to that lens may draw us closer to what this article terms a pragmatic abolitionist ethos regarding the use of technological tools in the criminal legal system. Such an ethos does not mean the immediate absence of a criminal legal system altogether. It instead means a criminal system that ultimately operates in ways dramatically different from the current regime by divesting from incarceration and investing in community well-being, human welfare, and rehabilitation.
March 10, 2021 in Procedure and Proof at Sentencing, Race, Class, and Gender, Technocorrections | Permalink | Comments (0)
Saturday, February 27, 2021
"Reprogramming Recidivism: The First Step Act and Algorithmic Prediction of Risk"
The title of this post is the title of this paper recently posted to SSRN and authored by Amy Cyphert. Here is its abstract:
The First Step Act, a seemingly miraculous bipartisan criminal justice reform bill, was signed into law in late 2018. The Act directed the Attorney General to develop a risk and needs assessment tool that would effectively determine who would be eligible for early release based on an algorithmic prediction of recidivism. The resulting tool — PATTERN — was released in the summer of 2019 and quickly updated in January of 2020. It was immediately put to use in an unexpected manner, helping to determine who was eligible for early release during the COVID-19 pandemic. It is now the latest in a growing list of algorithmic recidivism prediction tools, tools that first came to mainstream notice with critical reporting about the COMPAS sentencing algorithm.
This Article evaluates PATTERN, both in its development as well as its still-evolving implementation. In some ways, the PATTERN algorithm represents tentative steps in the right direction on issues like transparency, public input, and use of dynamic factors. But PATTERN, like many algorithmic decision-making tools, will have a disproportionate impact on Black inmates; it provides fewer opportunities for inmates to reduce their risk score than it claims and is still shrouded in some secrecy due to the government’s decision to dismiss repeated calls to release more information about it. Perhaps most perplexing, it is unclear whether the tool actually advances accuracy with its predictions. This Article concludes that PATTERN is a decent first step, but it still has a long way to go before it is truly reformative.
February 27, 2021 in FIRST STEP Act and its implementation, Procedure and Proof at Sentencing, Technocorrections | Permalink | Comments (1)
Friday, February 19, 2021
"Should Public Defenders Be Tweeting?"
The question in the title of this post is the headline of this notable new Vice article. I recommend the lengthy piece in full, and I suspect more than a few readers might have more than a few thoughts on this important modern-day topic. Here are just a few excerpts from a piece with lots of thought-provoking elements:
Public defenders had blogged about their work as long as a decade ago, and tweeting about arraignments wasn’t new, but [Scott] Hechinger and others in New York’s PD scene are responsible for popularizing the trend. As it’s grown, however, criminal justice reform advocates and formerly incarcerated people have started to argue that these posts can put clients at risk of retaliation from judges and prosecutors, violate their privacy, and present ethical quandaries for public defenders talking so openly about their work on Twitter. The optics of white public defenders gaining likes or retweets on stories of Black and brown suffering has also been called into question. As advocacy efforts morph from live-tweets to slick video productions, and gain traction with a public increasingly likely to support justice reform, the question has become: who should be telling the story?...
They are a powerful voice in the justice system, but one fear for public defenders and defendants alike is that the judge, prosecutor, or parole officer will retaliate against tweets that are critical of their actions, said Qiana Johnson, executive director of Life After Release, a program that assists people with re-entry. Even those who have already served a sentence are often constrained from speaking out by probation or parole conditions. “Their advocacy could cost them,” she said....
There have been complaints filed against defense attorneys over their use of social media in recent years, though those are not public, said Ellen C. Yaroshefsky, the Howard Lichtenstein Distinguished Professor of Legal Ethics at Hofstra University, who studies ethics in law. Prior to 2014, a Virginia attorney was disciplined for blogging about the cases he had won as a defense attorney, an act that constituted self-advertising, the state bar found, according to the 2013 law letter written by Nicole Hyland as part of an ethics committee. An Illinois public defender was fired from her job and suspended by the bar for 60 days after posting client details in 2007 and 2008 on Facebook, including a picture of her client’s leopard-skin underwear — evidence in a trial. Others have been disciplined for disparaging clients or judges. In March 2018, the American Bar Association issued a formal opinion limiting the ability of attorneys to blog or comment publicly about their cases.
Many on PD Twitter have also been called out for “trading on the suffering of Black and brown people,” said [Nicole] Smith Futrell, cautioning that “just because you’re a public defender representing someone who’s experienced [the system,] it doesn’t mean that you’ve experienced that thing that you now get to tell.” As defense attorneys push advocacy in new directions and accept media opportunities, they are encountering many of the ethical questions journalists have long wrestled with: does the individual or the larger narrative take precedence? When does “storytelling” become exploitative?...
Brendon Woods, the chief defender in Oakland’s Alameda County Public Defender’s Office, has been in the field since the late 90s when there were a few shared desktop computers available for public defenders to use. He sees plenty of law enforcement outfits on Twitter and said the voices of public defenders have been game-changing. They have power, he said, but that power must be used with care. “Our clients, they’ve been dehumanized by the system so much, you don’t need to have it happen from people who are tweeting or posting stories on Facebook,” he said, “and without any thought or strategy being put into them or why they're doing it.”
February 19, 2021 in Procedure and Proof at Sentencing, Technocorrections, Who Sentences | Permalink | Comments (1)
Monday, February 15, 2021
"Just Let People Have Cellphones in Prison"
The title of this post is the title of this notable new Slate commentary authored by Hannah Riley. I recommend the full piece, and here is how it starts:
In 2017, a man named Willie Nash was booked into a Mississippi county jail on a misdemeanor charge. For reasons that aren’t clear, his cellphone wasn’t confiscated as the law dictated. When he asked a jailer for a charger, the phone — which he had been using to text his wife — was seized. Nash was then sentenced to 12 years for possessing the cellphone. The case went all the way up to the Mississippi Supreme Court, where the 12-year sentence was affirmed. “While obviously harsh,” Justice James D. Maxwell II wrote for the court, “Nash’s twelve-year sentence for possessing a cell phone in a correctional facility is not grossly disproportionate.” Mr. Nash, a father of three, will be released back to his family in January of 2029, for the crime of texting his wife from jail.
In all federal and state prisons and jails, personal cellphones are classified as contraband — illegal for incarcerated people to possess. Incarcerated people are allowed to communicate with loved ones via letters, expensive phone calls in a centralized location (done through a prepaid account or collect calls, for a limited amount of time), or sometimes through expensive email and video messages on a prison-issued tablet. Due to COVID-19, in-person visitation has been halted in most prisons and jails since last March.
These rigid policies isolate incarcerated people and weaken their ties to friends and family. And this isolation radiates harm well beyond each individual. The vast majority of the millions of people currently incarcerated in this country will, at some point, be released. Every year, roughly 600,000 people leave prisons across the U.S., and a much higher number cycle in and out of jails. Roughly 2.7 million children in the U.S. have an incarcerated parent.... There is a wealth of research that confirms that the stronger the relationships and connections to loved ones and community, the better a person will fare once they are released from prison or jail. We’ve known this for a long time. A study from 1972 noted that, “The central finding of this research is the strong and consistent positive relationship that exists between parole success and maintaining strong family ties while in prison.” Decades later, the findings remain the same. “Incarcerated men and women who maintain contact with supportive family members are more likely to succeed after their release,” a 2012 Vera Institute report found.
There is one obvious way to facilitate these community ties: allow incarcerated people to have cellphones. For more than a decade, jailers and elected officials have attempted to incite a moral panic in the general public around the danger of cellphones, warning that incarcerated people would only use them to organize hits and buy drugs and run gangs on the outside.
It’s true that some incarcerated people have used contraband phones to extort people on the outside. But targeting the tools rather than the roots of the corruption and violence within prisons is misguided. A full decade ago, the New York Times conceded that the harsh penalties and increased vigilance weren’t working to keep phones out of prisons: “The logical solution would be to keep all cellphones out of prison. But that is a war that is being lost, corrections officials say.” That hasn’t changed. If you want to find a cellphone in prison or jail now, you can. One former sheriff in South Carolina even allowed detainees in his jail to purchase cheap cellphones from commissary, arguing access to cellphones actually improves safety....
The reality is that prisons and jails are already saturated with cellphones (mostly smuggled in by correctional officers), and the vast majority of people use them in the exact same ways the vast majority use them on the outside: to stay connected. To stave off boredom. To learn. To laugh.
February 15, 2021 in Prisons and prisoners, Technocorrections | Permalink | Comments (0)
Monday, October 05, 2020
"Remote Criminal Justice"
The title of this post is the title of this timely new article authored by Jenia Iontcheva Turner now available via SSRN. Here is its abstract:
The coronavirus pandemic has forced courts to innovate to provide criminal justice while protecting public health. Many have turned to online platforms in order to conduct criminal proceedings without undue delay. The convenience of remote proceedings has led some to advocate for their expanded use after the pandemic is over. To assess the promise and peril of online criminal justice, I surveyed state and federal judges, prosecutors, and defense attorneys across Texas, where virtual proceedings have been employed for a range of criminal proceedings, starting in March 2020. The survey responses were supplemented with direct observations of remote plea hearings and the first criminal jury trial conducted via Zoom.
The survey responses paint a complicated picture. They suggest that, on the whole, online proceedings can save time and resources for the participants in criminal cases and can provide broader access to the courts for the public. Yet respondents also noted the dangers of remote justice, particularly in contested or evidentiary hearings and trials. These include the inability of the parties to present evidence and confront witnesses effectively, and the challenges of providing adequate legal assistance remotely. Respondents also expressed concern that the court’s perception of defendants may be negatively skewed by technology and that indigent defendants might be disproportionately harmed by the use of remote hearings. Defense attorneys were especially likely to be concerned about the use of the online format and to believe that it tends to harm their clients. Federal judges and prosecutors were also more likely than their state counterparts to be skeptical of the benefits of online criminal proceedings outside the context of the pandemic.
Based on the survey responses, an analysis of scholarship and case law, and first-hand observations of virtual criminal proceedings, the Article concludes with several recommendations about the future use of online criminal justice. It argues that states should be wary of expanding the use of remote proceedings after the pandemic is over. Online technology could be used more broadly to conduct status hearings and hearings on questions of law and to increase the frequency of attorney-client consultations. Beyond these narrow circumstances, however, remote hearings post-pandemic should be used only sparingly, as they carry too many risks to the fairness of the proceedings. If jurisdictions make the choice to use virtual proceedings in circumstances beyond status hearings and legal arguments, this should be done only after obtaining an informed and voluntary consent from the defendant, and with great care taken to reduce the risks of unfairness and unreliable results.
October 5, 2020 in Impact of the coronavirus on criminal justice, Procedure and Proof at Sentencing, Technocorrections | Permalink | Comments (0)
Saturday, September 26, 2020
"The Perils of 'Old' and 'New' in Sentencing Reform"
The title of this post is the title of this notable new essay authored by Jessica Eaglin now available via SSRN. Here is its abstract:
The introduction of actuarial risk assessment tools into the sentencing process is a controversial, but popular trend in the states. While tools' proliferation is debated from numerous angles, scholarship tends to emphasize why this reform is new or old, and focus on whether and how this trend may improve or undermine sentencing law and policy. This Essay suggests that the institutionalization of actuarial risk assessments into the sentencing process in response to social and political critiques of criminal administration is both a new and old idea. It situates the proliferation of actuarial risk assessments in the context of technical guidelines created to structure and regulate judicial sentencing discretion in the 1980s and beyond. It then examines debates about two conceptual issues — selective incapacitation and equality — to highlight that technical sentencing reforms raise recurring questions at sentencing, even as social perspectives on resolving those questions are shifting.
Rather than using the "old" nature of these issues as evidence that actuarial risk assessments should proliferate, however, this Essay urges critical reflection on the turn toward the technical in the present day, in the face of mass incarceration. It urges scholars to dispense with the "old" and "new" concept when reflecting on whether and why actuarial risk assessments are proliferating in the states. It also encourages scholars to draw on the expansive methodological approaches applied to the study of sentencing guidelines when considering this reform going forward.
September 26, 2020 in Federal Sentencing Guidelines, Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Race, Class, and Gender, State Sentencing Guidelines, Technocorrections, Who Sentences | Permalink | Comments (0)
Tuesday, September 22, 2020
"Virtual Reality: Prospective Catalyst for Restorative Justice"
The title of this post is the title of this new article now on SSRN authored by Kate Bloch. Here is its abstract:
A 2018 U.S. Department of Justice report assessing data from 30 states found that 83% of individuals released from state prisons in 2005 were rearrested within nine years. When a revolving door ushers five of six individuals back into custody and decimates communities, more effective approaches to criminal justice demand attention. In countries around the world, restorative justice has been emerging as a promising candidate. It generally involves an interactive process in which stakeholders identify and grapple with harms caused by the crime.
But many environments lack the resources to invoke its benefits. While restorative justice takes various forms, the crux of each variant involves perspective taking — seeing the harm and its consequences through the eyes of those who experienced it. Cognitive science research suggests that the emerging technology of virtual reality provides an innovative and often especially compelling approach to perspective taking. Embodying an avatar offers the opportunity to experience the world as another and could make virtual perspective-taking encounters a valuable introduction for subsequent in-person encounters or offer a perspective-taking opportunity when in-person encounters are not practical or prudent. This analysis explores how virtual reality could become a catalyst for restorative justice.
September 22, 2020 in Purposes of Punishment and Sentencing, Technocorrections | Permalink | Comments (0)
Friday, August 21, 2020
"Unmuted: Solutions to Safeguard Constitutional Rights in Virtual Courtrooms and How Technology Can Expand Access to Counsel and Transparency in the Criminal Justice System"
The title of this post is the title of this timely new paper now on SSRN authored by Matt Bender. Here is its abstract:
A defendant’s fundamental right to a public trial, and the press and community’s separate right to watch court have been threatened by the shift to virtual hearings. These independent constitutional rights can be in harmony in some cases and clash in others. They cannot be incompatible. Public interest in criminal justice transparency is increasingly crystallized, but courts have often become more opaque, which jeopardizes First and Sixth Amendment rights.
This paper addresses the conflict and confronts a key question: how can we be assured that remote and virtual hearings like Zoom arraignments or trials guarantee the same rights as traditional court hearings? Instead of rejecting virtual criminal hearings outright, new proposals are offered for how virtual courtrooms can safeguard constitutional rights. The prevailing belief that criminal defendants should reject virtual trials is questioned. Virtual trials may lead to better outcomes for defendants than traditional trials, specifically during the ongoing pandemic. Beyond preserving rights in a virtual courtroom, the ways technology can improve the criminal justice system are explored.
Through an analysis of existing indigent defense and First Amendment scholarship, the myth that traditional court decorum should trump open court and virtual hearings is addressed. Judicial legitimacy and transparency may benefit when criminal cases are accessible on virtual platforms or livestreamed. Transparency can help safeguard defendants’ rights and improve indigent clients’ representation and outcomes. Instead of disrupting the courtroom — whether a hearing is virtual or traditional — convenient public access helps a community learn more about the criminal justice system and evaluate cases, judges, and attorneys.
These proposals have significant implications for courts and clients by providing a framework for virtual litigation, and leveraging technology for a more equitable criminal justice system. Livestreams and virtual, remote hearings can improve the right of representation for indigent defendants by increasing access to quality counsel, reducing costs, creating a more competitive legal market, and expanding a client’s choice of attorneys.
August 21, 2020 in Impact of the coronavirus on criminal justice, Procedure and Proof at Sentencing, Technocorrections | Permalink | Comments (0)
Thursday, August 13, 2020
"Making Sense of Risk"
The title of this post is the title of this new paper authored by Cecelia Klingele recently posted to SSRN. Here is its abstract:
Although actuarial risk prediction tools are widely used in the American criminal justice system, the lawyers, judges, and correctional workers who consult these products in making decisions often misunderstand fundamental aspects of how they work and what information they provide. This article suggests that the best way to ensure risk assessment tools are being used in ways that are just and equitable is to ensure that those who use them better understand three key aspects of what information they do — and do not — reveal. Doing so requires clarifying what risk is being predicted, explaining what risk levels signify, and enumerating how risk-related information is and is not relevant to specific criminal justice decisions.
August 13, 2020 in Procedure and Proof at Sentencing, Technocorrections, Who Sentences | Permalink | Comments (0)
Saturday, July 25, 2020
Is consideration of gender a must or a no-no in risk assessment tools?
The question in the title of this post is prompted by this latest article in the ongoing, terrific Law.com/Legaltech News series unpacking modern risk assessment tools. The headline and full subheadline of the piece reveal why it prompts this question: "Constitutional Brawl Looms Over How Risk Assessment Tools Account for Gender: Researchers say that scoring men and women differently is essential to account for risk assessment tools’ inherent gender bias. But it’s an open question whether these adjustments are violating state or constitutional law." I recommend the piece in full, and here are excerpts:
While there’s been a lot of focus on how risk assessment tools treat different racial demographics, little attention has been paid to another issue that may be just as problematic: how gender factors into risk scores. Researchers say accounting for the differences in gender ensures that risk assessments are more accurate, but exactly how they do so may run into constitutional challenges.
Legaltech News found one gender-specific risk assessment tool currently implemented in at least two states: the Women’s Risk and Needs Assessment (WRNA), which, like the Ohio Risk Assessment System (ORAS), was created by the University of Cincinnati. Kansas uses the WRNA to assess parolees’ risk in a women’s prison in Topeka, while Montana deploys it for women on probation or parole throughout its Department of Corrections....
The reason WRNA is needed in the first place is because most risk assessment instruments are validated (i.e. created) on a population that is majority male, in large part due to current gender imbalance in the criminal justice system (i.e. more men than women commit crimes and become incarcerated).
Dr. Teresa May, department director of the Community Supervision & Corrections Department in Harris County, Texas, also notes that on the whole, men have higher recidivism risk than women. “What we know is when you look at gender, almost always—and in fact I don’t know of an exception—the average rearrest rate [for women] is always much lower than men.” Without accounting for these differences, a risk assessment could end up scoring women as higher risk than they actually are....
There’s an ongoing debate over whether using gender as a risk factor, or assigning different cutoff risk levels to both males and females, violates the 14th Amendment. “Basically the Supreme Court of the U.S. has pursued what’s called an anti-classification approach to the equal protection law, which prohibits explicit use of factors like gender and race in making decisions,” says Christopher Slobogin, professor of law at the Vanderbilt University Law School. He adds, “It is permissible, constitutionally, to use race or gender if there is a compelling state interest in doing so. But generally speaking, the use of race and gender is unconstitutional to discriminate between groups.”
In his own opinion, however, Slobogin does not think the “Constitution is violated simply because a risk assessment arrives at different results for similarly situated men and women.” He argues that a tool that uses gender as a risk factor and one that has different cutoff scores for genders are functionally the same, adding that those adjustments make the instruments more accurate.
But others see it differently. Sonja Starr, professor of law at the University of Michigan Law School, for example, recently told the Philadelphia Inquirer that “use of gender as a risk factor plainly violates the Constitution.”
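Slobogin's equivalence point in the excerpt above can be illustrated with a small, hypothetical Python sketch: building a gender adjustment into the score itself and applying gender-specific cutoffs to a gender-neutral score can produce identical classifications. The one-point adjustment and the cutoff values below are invented solely for illustration.

```python
# Hypothetical illustration: the two formulations below classify every case
# identically; only the bookkeeping differs.

def classify_with_gender_term(base_score: float, is_female: bool) -> str:
    """Approach A: gender enters the score itself (made-up 1.0-point adjustment)."""
    adjusted = base_score - 1.0 if is_female else base_score
    return "high risk" if adjusted >= 5.0 else "low risk"

def classify_with_gender_cutoffs(base_score: float, is_female: bool) -> str:
    """Approach B: one gender-neutral score, but a cutoff shifted by the same 1.0."""
    cutoff = 6.0 if is_female else 5.0
    return "high risk" if base_score >= cutoff else "low risk"

if __name__ == "__main__":
    for score in [4.0, 5.0, 5.5, 6.0, 7.0]:
        for is_female in (False, True):
            a = classify_with_gender_term(score, is_female)
            b = classify_with_gender_cutoffs(score, is_female)
            assert a == b  # the two approaches always agree
            print(f"score={score:4.1f} female={is_female!s:5} -> {a}")
```

Because the two forms are interchangeable, the constitutional debate described above turns on whether any explicit gender adjustment is permissible, not on which form it takes.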
July 25, 2020 in Offender Characteristics, Procedure and Proof at Sentencing, Race, Class, and Gender, Technocorrections, Who Sentences | Permalink | Comments (0)
Tuesday, July 14, 2020
"The United States of Risk Assessment: The Machines Influencing Criminal Justice Decisions"
The title of this post is the title of this very useful Law.com/Legaltech News article and related research project by Rhys Dipshan, Victoria Hudgins and Frank Ready. The subtitle of the piece provides an overview: "In every state, assessment tools help courts decide certain cases or correctional officers determine the supervision and programming an offender receives. But the tools each state uses varies widely, and how they're put into practice varies even more." This companion piece, titled "The Most Widely Used Risk Assessment Tool in Each U.S State," provides this introduction:
There are dozens of risk assessment tools in use in local criminal justice systems around the country. Not all have a far-reaching impact, such as those specialized to a specific risk like domestic violence or those assessing risk for a certain demographic like juvenile offenders. The tools with the broadest impact and deployment, however, are ones that look at recidivism and pretrial risk in adult populations.
Below, we highlight these specific tools in use in each state, and the criminal justice decision points they influence. These findings are part of a broader research project examining how jurisdictions implement risk assessment tools, and how they determine that the tools work accurately and are implemented as intended. The project also dives into how risk assessment tools generate their scores and the debate around whether these instruments exacerbate or mitigate bias in criminal justice decision making.
July 14, 2020 in Data on sentencing, Detailed sentencing data, Procedure and Proof at Sentencing, State Sentencing Guidelines, Technocorrections, Who Sentences | Permalink | Comments (2)
Wednesday, June 10, 2020
"Sentenced to Surveillance: Fourth Amendment Limits on Electronic Monitoring"
The title of this post is the title of this notable new paper authored by Kate Weisburd and recently posted to SSRN. Here is its abstract:
As courts and legislatures increasingly recognize that “digital is different” and attempt to limit government surveillance of private data, one group is conspicuously excluded from this new privacy-protective discourse: the five million people in the United States on probation, parole, or other forms of community supervision. This Article is the first to explore how warrantless electronic surveillance is dramatically transforming community supervision and, as a result, amplifying a growing privacy-protection disparity: those in the criminal legal system are increasingly losing privacy protections even while those not in the system are increasingly gaining privacy protections. The quickly expanding use of GPS-equipped ankle monitors, as well as other forms of electronic searches, reflects unprecedented government surveillance that has yet to be regulated, scrutinized, or limited in any meaningful way.
This Article explores this phenomenon in its own right but also contends that the expanding disparity in privacy protections is explained by two underappreciated but significant shifts in Fourth Amendment jurisprudence. First, on the theory that defendants “choose” surveillance in exchange for avoiding incarceration, courts increasingly invoke consent to justify otherwise unconstitutional surveillance of people on community supervision. While the debate over criminal justice bargaining is not new, the expanded reliance on consent in this context reveals blind spots in the existing debate. Second, courts also increasingly accept government arguments in favor of otherwise unconstitutional electronic monitoring under a general “reasonableness” standard, as opposed to the traditional “special needs” doctrine. This insidious shift toward “reasonableness” threatens to jeopardize the precise interests the Fourth Amendment was designed to protect. But even under a reasonableness standard, electronic surveillance of people on community supervision should be more circumscribed. Ultimately, this Article reveals how the significance of these two shifts extends beyond electronic surveillance and represents a new frontier of sanctioning warrantless searches without any level of suspicion or exception to the warrant requirement.
June 10, 2020 in Criminal Sentences Alternatives, Prisons and prisoners, Procedure and Proof at Sentencing, Reentry and community supervision, Technocorrections, Who Sentences | Permalink | Comments (0)
Thursday, February 20, 2020
"From Decarceration to E-Carceration"
I am sorry to have missed this article by Chaz Arnett with the title used for the title of this post when it was first posted to SSRN some months ago, but I am glad to have seen it as recently revised. Here is its abstract:
Each year, millions of Americans experience criminal justice surveillance through electronic ankle monitors. These devices have fundamentally altered our understanding of incarceration, punishment, and the extent of the carceral state, as they are increasingly offered as moderate penal sanctions and viable solutions to the problem of mass incarceration. They purportedly enable decarceration, albeit with enhanced surveillance in the community as the compromise. Proponents of the devices tout the public safety and cost benefits while stressing the importance of depopulating prisons and returning individuals to their communities. In recent years, an oppositional movement has developed, focused on highlighting the social harms of electronic monitoring as part of a burgeoning e-carceration regime, where digital prisons arise, not as substitutes to brick and mortar buildings, but as net-widening correctional strategy operationalized to work in tandem.
This Paper examines this debate on the effectiveness of electronic ankle monitors using a social marginalization framework. It argues that the current scholarly debate on the use of electronic ankle monitors is limited because it fails to consider the potential harm of social marginalization, particularly for historically subordinated groups subjected to this form of surveillance. It uses system avoidance theory to elucidate the argument that intensive criminal justice surveillance has the counterproductive effect of causing those subjected to surveillance to avoid institutions necessary for adequate reintegration and reduction in recidivism. It offers a theory of the carceral state as malleable, extending beyond prison walls, expanding our carceral reality, and placing great strains on privacy, liberty, and democratic participation. Ultimately, it stresses that a move from decarceration to e-carceration, or from mass incarceration to mass surveillance, will likely fail to resolve, and may exacerbate, one of the greatest harms of mass incarceration: the maintenance of social stratification. Thus, adequately addressing this challenge will demand a more robust and transformative approach to criminal justice reform that shifts a punitive framework to a rehabilitative one focused on proven methods of increasing defendants’ and former offenders’ connections to their community and civic life, such as employment assistance programming, technical and entrepreneurial skill development, supportive housing options, and mental health services.
February 20, 2020 in Criminal Sentences Alternatives, Prisons and prisoners, Race, Class, and Gender, Reentry and community supervision, Scope of Imprisonment, Technocorrections | Permalink | Comments (0)
Tuesday, October 08, 2019
"Offline: Challenging Internet and Social Media Bans for Individuals on Supervision for Sex Offenses"
The title of this post is the title of this new article authored by Jacob Hutt now available via SSRN. Here is its abstract:
Tens of thousands of people across the United States are subject to bans on their Internet and social media access due to sex offense convictions. This Article explains why, even for those on parole and probation, such bans are frequently over-broad, imposed on the wrong people, and are now ripe for challenge in light of the Supreme Court’s 8-0 decision in Packingham v. North Carolina.
The first flaw with these bans is their mismatch between crime and condition. They are imposed on individuals whose criminal records have no relation to online predatory activity or manipulation of minors. The second flaw is their extreme over-breadth. Rather than merely proscribing speech with minors or access to certain online forums, they cordon off the Internet itself, ostracizing offenders to an offline society. While these flaws rendered Internet and social media bans constitutionally problematic before the Packingham decision, the Supreme Court’s imprimatur on free speech for individuals convicted of sex offenses could — and should — lead the way to future legal challenges of these bans.
October 8, 2019 in Reentry and community supervision, Sex Offender Sentencing, Technocorrections | Permalink | Comments (1)
Sunday, August 18, 2019
North Carolina Supreme Court holds mandatory lifetime GPS monitoring for some sex offenders violates Fourth Amendment
Four+ years ago, as noted in this post, the US Supreme Court issued a short per curiam summary reversal in Grady v. North Carolina, No. 14-593 (S. Ct. March 30, 2015) (available here), in which the Court clarified and confirmed that the Fourth Amendment is applicable to sex offender monitoring. That case was remanded back to the state courts, and late last week there was a major ruling by the Supreme Court of North Carolina in North Carolina v. Grady, No. 179A14-3 (N.C. Aug 16, 2019) (available here). This split ruling establishes that persons other than Torrey Grady will benefit from the application of the Fourth Amendment in this setting. Here is part of the start of the majority opinion (authored by Justice Earls) in this latest version of Grady:
The United States Supreme Court has determined that North Carolina’s satellite-based monitoring (SBM) of sex offenders, which involves attaching an ankle monitor “to a person’s body, without consent, for the purpose of tracking that individual’s movements,” constitutes a search within the meaning of the Fourth Amendment. Grady v. North Carolina, 135 S. Ct. 1368, 1370 (2015) (per curiam). The Supreme Court remanded the case for an examination of “whether the State’s monitoring program is reasonable — when properly viewed as a search.” Id. at 1371....
In accordance with this decision, this case was ultimately remanded to the superior court, which entered an order determining the SBM program to be constitutional. The Court of Appeals reversed, but only as to Mr. Grady individually. We conclude that the Court of Appeals erroneously limited its holding to the constitutionality of the program as applied only to Mr. Grady, when our analysis of the reasonableness of the search applies equally to anyone in Mr. Grady’s circumstances. Cf. Graham v. Florida, 560 U.S. 48, 82 (2010) (holding that state statutes mandating a sentence of life imprisonment without the possibility of parole are unconstitutional as applied to a specific group, namely juveniles who did not commit homicide).
In North Carolina, “SBM’s enrollment population consists of (1) offenders on parole or probation who are subject to State supervision, (2) unsupervised offenders who remain under SBM by court order for a designated number of months or years, and (3) unsupervised offenders subject to SBM for life, who are also known as ‘lifetime trackers.’ ” State v. Bowditch, 364 N.C. 335, 338, 700 S.E.2d 1, 3 (2010). Mr. Grady is in the third of these categories in that he is subject to SBM for life and is unsupervised by the State through probation, parole, or post-release supervision. Additionally, Mr. Grady is a “recidivist,” which makes lifetime SBM mandatory as to him without any individualized determination of the reasonableness of this search. Because we conclude that the relevant portions of N.C.G.S. §§ 14-208.40A(c) and 14-208.40B(c) are unconstitutional as applied to all individuals who, like Mr. Grady, are in the third Bowditch category and who are subject to mandatory lifetime SBM based solely on their status as a “recidivist,” we modify and affirm the opinion of the Court of Appeals.
And here is a paragraph from the start of the dissenting opinion authored by Justice Newby:
Using the remand as an opportunity to make a broad policy statement, the majority, though saying it addresses only one statutory classification, recidivist, applies an unbridled analysis which understates the crimes, overstates repeat sex offenders’ legitimate expectations of privacy, and minimizes the need to protect society from this limited class of dangerous sex offenders. The majority’s sweeping opinion could be used to strike down every category of lifetime monitoring under the SBM statute.
August 18, 2019 in Collateral consequences, Criminal Sentences Alternatives, Procedure and Proof at Sentencing, Sex Offender Sentencing, Technocorrections, Who Sentences | Permalink | Comments (1)
Monday, August 05, 2019
Are pretrial risk assessment algorithms really part of "socialist agendas that are sweeping this country"?
The question in the title of this post is prompted by this curious new Fox News commentary authored by US Senator John Kennedy under the headline "Bail, bond decisions are being made today with algorithms -- That puts your safety at risk." Here are excerpts:
Jurisdictions across the U.S. are snapping up algorithms as tools to help judges make bail and bond decisions. They’re being sold as race- and gender-neutral assessments that allow judges to use science in determining whether someone will behave if released from jail pending trial.
Really, they’re a dangerous collision of the poorly vetted cost cuts and socialist agendas that are sweeping this country.
The algorithms scare me because they’re being implemented for the same reason as the early release programs that are getting people killed. The goal isn’t to protect public safety. It’s to empty jail cells and release dangerous criminals on their own recognizance.
As a member of the Senate Judiciary Committee, I’m concerned about the recklessness of public policy that endangers people’s lives, especially in minority communities, where crime often is such a scourge. These algorithms -- called pretrial assessment release tools -- are the equivalent of using a Magic 8 ball in courtrooms. The results are disastrous to communities and great for criminals.
In my home state of Louisiana, New Orleans decided a few years ago to reduce the jail population. City officials started using a pretrial assessment release tool that was available for free from a nonprofit founded by a former hedge fund manager who became a billionaire through risky investments that turned into gold.
Do you know what happens when you allow a hedge fund manager to restructure your criminal justice system? You get a model that’s fraught with risk.
The new tool comes into play when someone is arrested on a felony charge, such as robbery or rape. The tool comes up with a score of one to five based on the defendant’s age, criminal history and several other factors. A “one” is considered a low risk to public safety. A “five” is considered justification for maximum supervision.
You would think that a risk level of “one” would be limited to people who jaywalk or shoplift. You would be wrong. In practice, a “five” apparently is reserved for people who kill busloads of nuns. Ordinary thugs get a “one” as long as they promise that they’ll spend all their time in church and attend every court appearance. They don’t have to regularly check in with a court officer or even call once a month....
The Metropolitan Crime Commission found that 37.6% of the people arrested for violent felonies in New Orleans during the third and fourth quarters of 2018 received the lowest risk level of “one.” That included more than 32% of the people arrested for homicide and 36.5% of the people arrested for rape.
Algorithms diminish public safety in this country. They ask us to pretend that lengthy arrest records and violent crimes don’t matter. They ask police to scoop up the bad guys only for the courts to immediately release them. They turn us into a bad joke.
The use of risk assessment algorithms, whether pretrial or at sentencing or in the prison system, is an important modern criminal justice development that justifies much scrutiny and can be criticized on many grounds. But this commentary by Senator Kennedy reads a bit like a parody.
For starters, one of the main reasons risk assessments are appealing is because judicial decision-making without the help of data can itself often seem a lot like "Magic 8 ball" decision-making. Moreover, all sound risk-assessment tools factor in arrest records and violent crimes, so they cannot properly be attacked for pretending that these past acts "don’t matter." And, most amusingly, I cannot quite fathom how efforts to make criminal justice decisions based on useful and relevant data amounts to part of "socialist agendas."
I would welcome Senator Kennedy encouraging the Senate Judiciary Committee to hold hearings about the pros and cons of using risk assessment algorithms in modern criminal justice systems. But, since he suggests giving judges more information is part of "socialist agendas that are sweeping this country," I worry he might think informing Senators more about these matters also somehow has mysterious sinister socialist undertones.
August 5, 2019 in Elections and sentencing issues in political debates, Procedure and Proof at Sentencing, Technocorrections, Who Sentences | Permalink | Comments (6)
Sunday, June 16, 2019
"Science and Ethics of Algorithms in the Courtroom"
The title of this post is the title of this new paper authored by Kia Rahnama now available via SSRN. Here is its abstract:
This Article analyzes the societal and cultural impacts of greater reliance on the use of algorithms in the courtroom. Big-data analytics and algorithms are beginning to play a large role in influencing judges’ sentencing and criminal enforcement decisions. This Article addresses this shift toward greater acceptance of algorithms as models for risk-assessment and criminal forecasting within the context of moral and social movements that have shaped the American justice system’s current approach to punishment and rehabilitation.
By reviewing salient problems of scientific uncertainty that accompany the use of these models and algorithms, the Article calls into question the proposition that greater reliance on algorithms in the courtroom can lead to a more objective and fair criminal sentencing regime. Far from liberating society from the biases and prejudices that might pollute judges’ decision-making process, these tools can intensify, while simultaneously concealing, entrenched cultural biases that preexist in society.
Using common themes from the field of Science and Technology Studies (STS), including boundary-work analysis and Public Understanding of Science (PUS), this Article highlights unique technical characteristics of big-data analytics and algorithms that feed into undesirable and deeply held values and beliefs. This Article draws attention to specific gaps in technical understanding of algorithmic thinking, such as the black box of algorithms, that can have a discordant impact on communicating uncertainty to the populace and reduce accountability and transparency in regulating the use of algorithms. This Article also provides specific policy proposals that can ameliorate the adverse social and cultural effects of incorporating algorithms into the courtroom. The discussion of policy proposals borrows from the STS literature on public participation in science and encourages adoption of a policy that incorporates diverse voices from political actors, most affected communities, and the offenders themselves.
June 16, 2019 in Procedure and Proof at Sentencing, Race, Class, and Gender, Technocorrections | Permalink | Comments (0)
Saturday, June 01, 2019
Another notable study spotlighting a now ubiquitous technology to help explain the great crime decline
One of many major mysteries in the modern history of US crime and punishment is just why crime rates rose so dramatically in the 1970s and 1980s and then fell so dramatically in the 1990s and 2000s. Lots of folks have lots of data to support lots of ideas, and this recent Atlantic piece by Alexis Madrigal, headlined "The Collapsing Crime Rates of the ’90s Might Have Been Driven by Cellphones," provides perhaps another piece of the story. Here are excerpts from the piece (with a few links preserved) that provide a nice review of the state of this debate:
It’s practically an American pastime to blame cellphones for all sorts of societal problems, from distracted parents to faltering democracies. But the devices might have also delivered a social silver lining: a de-escalation of the gang turf wars that tore up cities in the 1980s. The intriguing new theory suggests that the arrival of mobile phones made holding territory less important, which reduced intergang conflict and lowered profits from drug sales.
Lena Edlund, a Columbia University economist, and Cecilia Machado, of the Getulio Vargas Foundation, lay out the data in a new National Bureau of Economic Research working paper. They estimate that the diffusion of phones could explain 19 to 29 percent of the decline in homicides seen from 1990 to 2000.
“The cellphones changed how drugs were dealt,” Edlund told me. In the ’80s, turf-based drug sales generated violence as gangs attacked and defended territory, and also allowed those who controlled the block to keep profits high. The cellphone broke the link, the paper claims, between turf and selling drugs. “It’s not that people don’t sell or do drugs anymore,” Edlund explained to me, “but the relationship between that and violence is different.”
Edlund and Machado used Federal Communications Commission data on cellular-infrastructure deployment and matched it against the FBI’s (admittedly spotty) database on homicides across the country. They demonstrated a negative relationship that was even stronger for black and Latino populations. The title of their paper suggests that a crucial aspect of understanding declining crime has been hiding in plain sight for years: “It’s the Phone, Stupid: Mobiles and Murder.”
Their theory is the latest entry in a series of attempts to explain the components of the long-term decline in crime that began in the early 1990s. The rise and fall of crime in the late 20th century (and into the 21st) is one of the great mysteries of social science. No one has come up with an explanation that fully—and incontestably—accounts for the falling crime rates. Many have tried, and shown substantial initial results, only to have their findings disputed.
Edlund and Machado are not the first to suggest that phones could have played a role in the decline. Among others, the criminologists Erin Orrick and Alex Piquero were able to show that property crime fell as cellphone-ownership rates climbed. The first paper on the cellphone-crime link suggested that phones were an “underappreciated” crime deterrent, as mobile communications allow illegal behavior to be reported more easily and quickly.
But cellphones are far from the only possible explanation. Any measurement that was going up in the ’90s correlates with the decline of violence. Thus, there are probably too many theories out there, each with limited explanatory power. One commonsense argument that’s been made is that certain police tactics (say, stop-and-frisk or the “broken windows” approach) or the explosion of incarceration rates must have been responsible for the decline, but most careful reviews have found little evidence to suggest that they had more than a marginal impact.
The University of New Haven criminologist Maria Tcherni-Buzzeo published a review of the contending theories in 2018 that found no fewer than 24 different explanations for why crime began a multi-decade decline in the early 1990s, through economic times good and bad, in different countries and cities, under draconian policing regimes and more progressive ones.
Every theory has its proponents and detractors. For example, the economists Steven Levitt and John Donohue proposed (and doubled down on) the idea that legalizing abortion reduced crime rates by cutting down on the number of unwanted pregnancies and children born into situations that make them more likely to fall into criminal life. Tcherni-Buzzeo described the theory as “thoroughly debunked by empirical research” in a 2018 book chapter looking at the theories behind the crime decline. Yet Levitt and Donohue’s most recent research, published as a working paper this month, contends they were even more right all along than they’d thought, and that the “cumulative impact of legalized abortion on crime is roughly 45 percent, accounting for a very substantial portion of the roughly 50–55 percent overall decline from the peak of crime in the early 1990s.”
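For readers curious about the mechanics behind the Edlund and Machado estimate, here is a minimal sketch of the kind of county-by-year panel regression the piece describes, run on synthetic data; the variable names and specification are my own assumptions for illustration, not the authors' actual model or dataset.

```python
# Synthetic panel: county-level homicide rates and cellphone coverage over the 1990s.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for county in range(50):
    baseline = rng.normal(10, 2)                                        # county-specific homicide baseline
    for year in range(1990, 2001):
        coverage = min(1.0, 0.1 * (year - 1990) + rng.uniform(0, 0.1))  # phones diffuse over the decade
        homicide_rate = baseline - 3.0 * coverage + rng.normal(0, 1)    # negative link built into the toy data
        rows.append({"county": county, "year": year,
                     "cell_coverage": coverage, "homicide_rate": homicide_rate})
panel = pd.DataFrame(rows)

# Two-way fixed effects: county and year dummies absorb level differences, so the
# coefficient reflects the within-county association between coverage and homicide.
fit = smf.ols("homicide_rate ~ cell_coverage + C(county) + C(year)", data=panel).fit()
print(round(fit.params["cell_coverage"], 2))   # negative, by construction of this toy data
```

The hard part of the real exercise is not the regression but the identification: convincing readers that phone coverage did not simply track everything else that was changing in the 1990s.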
June 1, 2019 in National and State Crime Data, Technocorrections | Permalink | Comments (1)
Friday, May 31, 2019
"Beyond Bias: Re-Imagining the Terms of ‘Ethical AI’ in Criminal Law"
The title of this post is the title of this notable new paper authored by Chelsea Barabas. Here is its abstract:
Data-driven decision-making regimes, often branded as “artificial intelligence,” are rapidly proliferating across the US criminal justice system as a means of predicting and managing the risk of crime and addressing accusations of discriminatory practices. These data regimes have come under increased scrutiny, as critics point out the myriad ways that they can reproduce or even amplify pre-existing biases in the criminal justice system. This essay examines contemporary debates regarding the use of “artificial intelligence” as a vehicle for criminal justice reform by closely examining two general approaches to what has been widely branded as “algorithmic fairness” in criminal law: 1) the development of formal fairness criteria and accuracy measures that illustrate the trade-offs of different algorithmic interventions and 2) the development of “best practices” and managerialist standards for maintaining a baseline of accuracy, transparency and validity in these systems.
The essay argues that attempts to render AI-branded tools more accurate by addressing narrow notions of “bias” miss the deeper methodological and epistemological issues regarding the fairness of these tools. The key question is whether predictive tools reflect and reinforce punitive practices that drive disparate outcomes, and how data regimes interact with the penal ideology to naturalize these practices. The article concludes by calling for an abolitionist understanding of the role and function of the carceral state, in order to fundamentally reformulate the questions we ask, the way we characterize existing data, and how we identify and fill gaps in existing data regimes of the carceral state.
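For readers who want a concrete sense of the "formal fairness criteria and accuracy measures" the abstract references, here is a minimal sketch on synthetic data; the groups, score distributions, and threshold are assumptions for illustration and have nothing to do with the essay's own analysis.

```python
# Two groups whose risk scores are equally well calibrated but whose underlying
# base rates differ; the result is unequal false positive rates at any cut-off.
import numpy as np

rng = np.random.default_rng(1)

def group_metrics(scores, threshold=0.5):
    # Draw outcomes so that P(reoffend | score) = score, i.e., the scores are
    # perfectly calibrated within the group by construction.
    reoffend = rng.uniform(0, 1, scores.size) < scores
    flagged = scores >= threshold
    base_rate = reoffend.mean()
    false_positive_rate = flagged[~reoffend].mean()    # non-reoffenders flagged "high risk"
    mid_band = (scores > 0.45) & (scores < 0.55)
    calibration_check = reoffend[mid_band].mean()      # roughly 0.5 in both groups
    return round(base_rate, 2), round(false_positive_rate, 2), round(calibration_check, 2)

scores_low = rng.beta(2, 5, 200_000)     # group with a lower underlying base rate
scores_high = rng.beta(5, 2, 200_000)    # group with a higher underlying base rate
print("lower base-rate group :", group_metrics(scores_low))
print("higher base-rate group:", group_metrics(scores_high))
```

When base rates differ, a tool that is calibrated for both groups cannot also equalize false positive and false negative rates; that impossibility result is what much of the formal fairness literature is wrestling with.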
May 31, 2019 in Offender Characteristics, Procedure and Proof at Sentencing, Technocorrections, Who Sentences | Permalink | Comments (0)
Tuesday, May 07, 2019
"Report on Algorithmic Risk Assessment Tools in the U.S. Criminal Justice System"
The title of this post is the title of this notable new report "written by the staff of the Partnership on AI (PAI) and many of [its] Partner organizations." Here is part of the report's executive summary:
This report documents the serious shortcomings of risk assessment tools in the U.S. criminal justice system, most particularly in the context of pretrial detentions, though many of our observations also apply to their uses for other purposes such as probation and sentencing. Several jurisdictions have already passed legislation mandating the use of these tools, despite numerous deeply concerning problems and limitations. Gathering the views of the artificial intelligence and machine learning research community, PAI has outlined ten largely unfulfilled requirements that jurisdictions should weigh heavily and address before further use of risk assessment tools in the criminal justice system.
Using risk assessment tools to make fair decisions about human liberty would require solving deep ethical, technical, and statistical challenges, including ensuring that the tools are designed and built to mitigate bias at both the model and data layers, and that proper protocols are in place to promote transparency and accountability. The tools currently available and under consideration for widespread use suffer from several of these failures, as outlined within this document.
We identified these shortcomings through consultations with our expert members, as well as reviewing the literature on risk assessment tools and publicly available resources regarding tools currently in use. Our research was limited in some cases by the fact that most tools do not provide sufficiently detailed information about their current usage to evaluate them on all of the requirements in this report. Jurisdictions and companies developing these tools should implement Requirement 8, which calls for greater transparency around the data and algorithms used, to address this issue for future research projects. That said, many of the concerns outlined in this report apply to any attempt to use existing criminal justice data to train statistical models or to create heuristics to make decisions about the liberty of individuals.
Challenges in using these tools effectively fall broadly into three categories, each of which corresponds to a section of our report:
-- Concerns about the validity, accuracy, and bias in the tools themselves;
-- Issues with the interface between the tools and the humans who interact with them; and
-- Questions of governance, transparency, and accountability.
Although the use of these tools is in part motivated by the desire to mitigate existing human fallibility in the criminal justice system, it is a serious misunderstanding to view tools as objective or neutral simply because they are based on data. While formulas and statistical models provide some degree of consistency and replicability, they still share or amplify many weaknesses of human decision-making. Decisions regarding what data to use, how to handle missing data, what objectives to optimize, and what thresholds to set all have significant implications on the accuracy, validity, and bias of these tools, and ultimately on the lives and liberty of the individuals they assess....
In light of these issues, as a general principle, these tools should not be used alone to make decisions to detain or to continue detention. Given the pressing issue of mass incarceration, it might be reasonable to use these tools to facilitate the automatic pretrial release of more individuals, but they should not be used to detain individuals automatically without additional (and timely) individualized hearings. Moreover, any use of these tools should address the bias, human-computer interface, transparency, and accountability concerns outlined in this report.
This report highlights some of the key problems encountered using risk assessment tools for criminal justice applications. Many important questions remain open, however, and unknown issues may yet emerge in this space. Surfacing and answering those concerns will require ongoing research and collaboration between policymakers, the AI research community, and civil society groups. It is PAI’s mission to spur and facilitate these conversations and to produce research to bridge these gaps.
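The report's point about threshold-setting is easy to make concrete. The sketch below, on purely synthetic scores, shows how the choice of cut-off, which is a policy decision rather than a statistical one, moves the detention rate and both kinds of error; nothing here comes from the PAI report's own data.

```python
# How the choice of cut-off changes who is detained and which errors are made.
import numpy as np

rng = np.random.default_rng(7)
scores = rng.beta(2, 3, 100_000)                      # hypothetical pretrial risk scores in [0, 1]
reoffend = rng.uniform(0, 1, scores.size) < scores    # synthetic outcomes tied to the scores

for threshold in (0.3, 0.5, 0.7):
    detained = scores >= threshold
    detention_rate = detained.mean()
    detained_non_reoffenders = detained[~reoffend].mean()   # would not have reoffended
    released_reoffenders = (~detained)[reoffend].mean()     # released and later reoffend
    print(f"cut-off {threshold:.1f}: detain {detention_rate:.0%}, "
          f"non-reoffenders detained {detained_non_reoffenders:.0%}, "
          f"reoffenders released {released_reoffenders:.0%}")
```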
May 7, 2019 in Data on sentencing, Detailed sentencing data, Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Technocorrections, Who Sentences | Permalink | Comments (1)
Wednesday, April 24, 2019
"How to Argue with an Algorithm: Lessons from the COMPAS ProPublica Debate"
The title of this post is the title of this notable new article authored by Anne Washington and now available via SSRN. Here is its abstract:
The United States optimizes the efficiency of its growing criminal justice system with algorithms; however, legal scholars have overlooked how to frame courtroom debates about algorithmic predictions. In State v Loomis, the defense argued that the court’s consideration of risk assessments during sentencing was a violation of due process because the accuracy of the algorithmic prediction could not be verified. The Wisconsin Supreme Court upheld the consideration of predictive risk at sentencing because the assessment was disclosed and the defendant could challenge the prediction by verifying the accuracy of data fed into the algorithm.
Was the court correct about how to argue with an algorithm?
The Loomis court ignored the computational procedures that processed the data within the algorithm. How algorithms calculate data is equally as important as the quality of the data calculated. The arguments in Loomis revealed a need for new forms of reasoning to justify the logic of evidence-based tools. A “data science reasoning” could provide ways to dispute the integrity of predictive algorithms with arguments grounded in how the technology works.
This article’s contribution is a series of arguments that could support due process claims concerning predictive algorithms, specifically the Correctional Offender Management Profiling for Alternative Sanctions (“COMPAS”) risk assessment. As a comprehensive treatment, this article outlines the due process arguments in Loomis, analyzes arguments in an ongoing academic debate about COMPAS, and proposes alternative arguments based on the algorithm’s organizational context.
Risk assessment has dominated one of the first wide-ranging academic debates within the emerging field of data science. ProPublica investigative journalists claimed that the COMPAS algorithm is biased and released their findings as open data sets. The ProPublica data started a prolific and mathematically-specific conversation about risk assessment as well as a broader conversation on the social impact of algorithms. The ProPublica-COMPAS debate repeatedly considered three main themes: mathematical definitions of fairness, explainable interpretation of models, and the importance of population comparison groups.
While the Loomis decision addressed permissible use for a risk assessment at sentencing, a deeper understanding of daily practice within the organization could extend debates about algorithms to questions about procurement, implementation, or training. The criminal justice organization that purchased the risk assessment is in the best position to justify how one individual’s assessment matches the algorithm designed for its administrative needs. People subject to a risk assessment cannot conjecture how the algorithm ranked them without knowing why they were classified within a certain group and what criteria control the rankings. The controversy over risk assessment algorithms hints at whether procedural due process is the cost of automating a criminal justice system that is operating at administrative capacity.
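Because ProPublica released its findings as open data, the group-comparison analysis at the center of the debate can be reproduced in a few lines. The sketch below assumes the file and column names from ProPublica's public release as I recall them; treat them as assumptions and adjust to the CSV you actually download.

```python
# Comparing false positive rates across groups, ProPublica-style.
import pandas as pd

df = pd.read_csv("compas-scores-two-years.csv")        # assumed name of ProPublica's released file
df = df[df["race"].isin(["African-American", "Caucasian"])]

df["high_risk"] = df["decile_score"] >= 5              # ProPublica treated deciles 5+ as higher risk
df["no_recid"] = df["two_year_recid"] == 0

for race, group in df.groupby("race"):
    # Among people who did not reoffend within two years, how many were flagged high risk?
    fpr = group.loc[group["no_recid"], "high_risk"].mean()
    print(race, round(fpr, 3))
```

ProPublica's headline finding was a markedly higher false positive rate for Black defendants; Northpointe's response emphasized calibration and predictive parity, and the article's point is that courtroom argument needs a vocabulary for adjudicating between those competing framings.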
April 24, 2019 in Data on sentencing, Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Technocorrections, Who Sentences | Permalink | Comments (0)
Saturday, April 13, 2019
"Introducing Disruptive Technology to Criminal Sanctions: Punishment by Computer Monitoring to Enhance Sentencing Fairness and Efficiency"
The title of this post is the title of this paper recently posted to SSRN authored by Mirko Bagaric, Dan Hunter and Colin Loberg. Here is its abstract:
The United States criminal justice system is the most punitive on earth. The total correctional population is nearly seven million, equating to a staggering one in thirty-eight adults. Most of the correctional population comprises offenders who are on parole or probation. The financial burden this imposes on the community is prohibitive. Further, a high portion of offenders who are on parole or probation offend during the period of the sanction. This Article proposes an overdue solution to the crisis which exists in relation to the imposition of criminal sanctions. The solution is especially timely given that there is now a considerable consensus emerging among lawmakers and the wider community that reforms need to be implemented to reduce the cost of criminal sanctions and improve their effectiveness. Moreover, the United States Sentencing Commission has recently proposed an amendment to increase the availability of sentences which are alternatives to incarceration.
With little hint of exaggeration, the sentencing system remains in a primitive state when it comes to adopting technological advances. This Article seeks to address this failing as a means of overcoming the main shortcomings of current common criminal sanctions. Forty years ago, it was suggested that the most effective way to deal with crime was to assign a police officer to watch over every move of each offender. The proposal was dubbed “cop-a-con”. This would nearly guarantee that offenders did not re-offend, while eliminating the adverse consequences of prison. This proposal was manifestly unviable due to its excessive costs. Technological advances now make the concept achievable in a cost-effective manner.
It is now possible to monitor the locations and actions of individuals in real time and to detect crime as it is in the process of being committed. Adapted properly to the criminal justice system, technology has the potential to totally reshape the nature and efficacy of criminal sanctions. The sanctions which are currently utilized to deal with the most serious offenders, namely imprisonment, probation and parole, can be replaced with technological monitoring which can more efficiently, effectively and humanely achieve the appropriate objectives of sentencing. Technological disruption in the criminal justice sector is not only desirable, it is imperative. Financial pressures and normative principles mandate that the United States can no longer remain the world’s most punitive nation. The sanction suggested in the Article (“the monitoring sanction”) has the potential to more efficiently and economically impose proportionate punishment than current probation and parole systems, while enhancing public safety.
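To make the proposed "monitoring sanction" a bit more concrete, here is a minimal sketch of the kind of geofence check such a system would run against each incoming GPS fix; the coordinates, radius, and alert logic are invented for illustration and are not drawn from the article.

```python
# Check whether a GPS fix falls inside an offender's permitted zone of movement.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

PERMITTED_CENTER = (29.9511, -90.0715)    # hypothetical home address
PERMITTED_RADIUS_KM = 2.0                 # hypothetical movement allowance

def check_fix(lat, lon):
    distance = haversine_km(lat, lon, *PERMITTED_CENTER)
    if distance <= PERMITTED_RADIUS_KM:
        return "ok"
    return f"alert: {distance:.1f} km from center, limit {PERMITTED_RADIUS_KM} km"

print(check_fix(29.9600, -90.0800))   # inside the permitted radius
print(check_fix(30.0500, -90.2000))   # would trigger an alert
```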
April 13, 2019 in Purposes of Punishment and Sentencing, Scope of Imprisonment, Technocorrections | Permalink | Comments (0)
Thursday, March 14, 2019
"The Source of the Stink: A Private Delegation Framework for Recidivism Risk Assessment"
The title of this post is the title of this recent paper authored by Andrea Nishi now available via SSRN. Here is its abstract:
This paper explores the use of privately developed risk assessment algorithms in criminal sentencing, arguing that these tools are developed in a way that hinders the enforcement of constitutional protections and gives private algorithm developers undue influence in sentencing determinations. Using the private delegation doctrine, which limits Congress's ability to delegate to private actors, the paper aims to strengthen the state statutory frameworks that govern the use of these tools to restore accountability to the sentencing process.
March 14, 2019 in Procedure and Proof at Sentencing, Technocorrections, Who Sentences | Permalink | Comments (0)
Saturday, March 09, 2019
"Criminal-Justice Apps"
The title of this post is the title of this interesting essay authored by Adam Gershowitz recently posted to SSRN. Here is its abstract:
In recent years, lawyers, activists, and policymakers have introduced cell phone applications -- “criminal justice apps” -- that are very slowly beginning to democratize the criminal justice system. Some of these apps -- such as DWI apps -- teach citizens about the law, while also connecting them with lawyers and bail bondsmen. Policymakers have created apps to help defendants navigate confusing court systems. Still other apps are grander in purpose and seek to bring systemic changes to the criminal justice system. For instance, the ACLU has designed an app to fight wrongful seizures and police brutality by recording police interactions. A non-profit has created an app to aggregate spare change and use it to post bail for the indigent. This essay explores the universe of criminal justice apps and considers how they are likely to result in modest improvements to the criminal justice system.
March 9, 2019 in Technocorrections, Who Sentences | Permalink | Comments (0)
Wednesday, January 30, 2019
"Beyond State v. Loomis: Artificial Intelligence, Government Algorithmization, and Accountability"
The title of this post is the title of this notable new paper now available via SSRN and authored by Han-Wei Liu, Ching-Fu Lin and Yu-Jie Chen. Here is its abstract:
Developments in data analytics, computational power, and machine learning techniques have driven all branches of the government to outsource authority to machines in performing public functions — social welfare, law enforcement, and most importantly, courts. Complex statistical algorithms and artificial intelligence (AI) tools are being used to automate decision-making and are having a significant impact on individuals’ rights and obligations. Controversies have emerged regarding the opaque nature of such schemes, the unintentional bias against and harm to underrepresented populations, and the broader legal, social, and ethical ramifications.
State v. Loomis, a recent case in the United States, well demonstrates how unrestrained and unchecked outsourcing of public power to machines may undermine human rights and the rule of law. With a close examination of the case, this Article unpacks the issues of the ‘legal black box’ and the ‘technical black box’ to identify the risks posed by rampant ‘algorithmization’ of government functions to due process, equal protection, and transparency. We further assess some important governance proposals and suggest ways for improving the accountability of AI-facilitated decisions. As AI systems are commonly employed in consequential settings across jurisdictions, technologically-informed governance models are needed to locate optimal institutional designs that strike a balance between the benefits and costs of algorithmization.
January 30, 2019 in Procedure and Proof at Sentencing, Technocorrections, Who Sentences | Permalink | Comments (0)
Sunday, December 16, 2018
Iowa Supreme Court dodges due process challenges to use of risk-assessment tools at sentencing
A helpful reader made sure I did not miss a trio of rulings handed down late last week by the Iowa Supreme Court, all of which raised issues concerning the permissibility of courts using risk-assessment tools at sentencing. The rulings came in Iowa v. Gordon, Iowa v. Guise and Iowa v. Buesing, and in each instance the court decided that a constitutional challenge to the use of the Iowa Risk Revised (IRR) risk assessment tool at sentencing was not properly raised and preserved. The Gordon case addresses this point most fully, and here is how the other cases describe the Gordon ruling:
Today, we filed an opinion in State v. Gordon, ____ N.W.2d ____ (Iowa 2018). In Gordon, we held a defendant could not raise this due process argument for the first time on appeal when the defendant did not bring the issue to the district court at the time of sentencing. Id. at ___. Furthermore, we held we could not address this due process issue under the rubric of ineffective assistance of counsel because the record is insufficient to reach this claim. Id.
Though the Gordon case has the fullest discussion of the merits in this trio of decisions, the Guise case is the best read because of Justice Appel's extended opinion "concurring specially." This concurrence talks through various concerns about the use of risk-assessment instruments at sentencing (with lots of cites to lots of academic scholarship), and here are a few notable passages:
Guise’s argument that due process requires accurate information about risk assessments beyond a mere conclusion, as demonstrated by Malenchik and Loomis, is certainly not frivolous. Certainly the shiny legal penny of a new risk assessment tool should be carefully scrutinized by the courts.... The relentless and potentially corrosive drive for efficiency and certainty in a resource-scarce public sector should not drive courts to use risk assessments in an unjustified “off label” manner or in a fashion that otherwise lacks meaningful empirical support to drive sentencing.
Even if the emerging risk assessment tools are found to have a place in sentencing as a “relevant” factor, our law does not allow mere conclusions to be mounted on spikes and paraded around our courtrooms without statistical context....
We do not know whether the IRR was normed with an appropriate Iowa population. We do not know whether the tool has been renormed and monitored. We do not know anything, really, about the database, assuming there is a database, behind the IRR.
I am also concerned about process issues lurking behind this case. Ordinarily, the PSI report is made available to the defendant only a few days before sentencing.... But a few days’ notice is not enough time for a defendant to mount a serious challenge to the underlying reliability of the risk assessment evidence as being so unreliable as to be hocus pocus. A full-court press on the question of reliability of the risk assessment would likely require the hiring of a highly qualified expert. Even if the defendant does not wish to mount a full-blown attack on the statistical model and instead wishes to make a more limited point — say, for instance, the disproportionate impact of use of housing, employment, and level of educational attainment of people of color — the defense will not be able to develop the attack in a few days, particularly when the defendant is indigent and will require court approval prior to the hiring of an expert to challenge the statistical information....
In conclusion, I want to make clear that I do not categorically reject any use of risk assessment tools in the sentencing process. I recognize that the PEW Center on the States, the National Institute of Corrections, the National Center for State Courts, and the American Law Institute have all expressed interest in evidence-based sentencing. See J.C. Oleson, Risk in Sentencing: Constitutionally Suspect Variables and Evidence-Based Sentencing, 64 SMU L. Rev. 1329, 1343, 1394 (2011). I also recognize that sentencing based solely on “intuition” or “gut” runs the risk of allowing implied bias free rein and can be lawless in nature. See Chris Guthrie et al., Blinking on the Bench: How Judges Decide Cases, 93 Cornell L. Rev. 1, 5 (2007) (urging the justice system to take steps to limit the impact of overreliance on intuition). Further, the “intuition” or “gut” of a judge who was a former prosecutor may well differ from the “intuition” or “gut” of a public defender. Undisciplined intuitive sentencing runs the risk of telling us more about the judge than the person being sentenced.
A fully-developed record may well show that risk and needs assessment tools that assemble variables in a statistically valid way may be of some assistance as a check on unregulated sentencing discretion and may promote deeper thinking by discretionary decision-makers into the sentencing process. In short, it is possible that when a full record is developed, properly designed and utilized risk assessment tools may enhance and inform the exercise of judicial discretion. In addition to the binary question of whether a risk assessment may or may not be used in sentencing, however, more nuanced additional questions must be asked regarding how any such tool may be used. In light of the procedural posture of this case and the companion cases, these questions must await further legal developments.
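Justice Appel's questions about whether the IRR was "normed" on an appropriate Iowa population, and whether it has been monitored over time, have a concrete statistical meaning. Here is a minimal sketch, on synthetic data, of the two checks a court or agency would want to see: discrimination (does the tool rank reoffenders above non-reoffenders in the local population?) and calibration (does a given score mean locally what it meant in the development sample?). Nothing here reflects the actual IRR.

```python
# A toy local-validation check for an imported risk tool.
import numpy as np

rng = np.random.default_rng(3)

def auc(scores, outcomes):
    """Probability that a randomly chosen reoffender scores above a randomly chosen non-reoffender."""
    positives, negatives = scores[outcomes], scores[~outcomes]
    return (positives[:, None] > negatives[None, :]).mean()

# Pretend the tool was developed elsewhere and its scores were meant to equal reoffense
# probabilities; in this synthetic "local" population the true relationship is weaker.
scores = rng.beta(2, 3, 5000)
outcomes = rng.uniform(0, 1, scores.size) < 0.6 * scores

print("local AUC:", round(auc(scores, outcomes), 2))

# Calibration by score band: does a nominal "0.6" mean 60% locally?
bands = np.clip((scores * 10).astype(int), 0, 9)
for band in (0, 3, 6, 9):
    in_band = bands == band
    print(f"band {band}: nominal ~{band / 10:.1f}, observed {outcomes[in_band].mean():.2f}")
```

Both checks require exactly the kind of local outcome data and documentation that, as the concurrence notes, nobody in these cases appears to have.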
December 16, 2018 in Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Sentences Reconsidered, State Sentencing Guidelines, Technocorrections, Who Sentences | Permalink | Comments (1)
Friday, November 09, 2018
Michelle Alexander frets about "The Newest Jim Crow"
Michelle Alexander has this notable new New York Times opinion piece headlined "The Newest Jim Crow: Recent criminal justice reforms contain the seeds of a frightening system of 'e-carceration'." I recommend the piece in full, and here are excerpts:
Since 2010, when I published “The New Jim Crow” — which argued that a system of legal discrimination and segregation had been born again in this country because of the war on drugs and mass incarceration — there have been significant changes to drug policy, sentencing and re-entry, including “ban the box” initiatives aimed at eliminating barriers to employment for formerly incarcerated people.
This progress is unquestionably good news, but there are warning signs blinking brightly. Many of the current reform efforts contain the seeds of the next generation of racial and social control, a system of “e-carceration” that may prove more dangerous and more difficult to challenge than the one we hope to leave behind.
Bail reform is a case in point. Thanks in part to new laws and policies — as well as actions like the mass bailout of inmates in New York City jails that’s underway — the unconscionable practice of cash bail is finally coming to an end. In August, California became the first state to decide to get rid of its cash bail system; last year, New Jersey virtually eliminated the use of money bonds.
But what’s taking the place of cash bail may prove even worse in the long run. In California, a presumption of detention will effectively replace eligibility for immediate release when the new law takes effect in October 2019. And increasingly, computer algorithms are helping to determine who should be caged and who should be set “free.” Freedom — even when it’s granted, it turns out — isn’t really free.
Under new policies in California, New Jersey, New York and beyond, “risk assessment” algorithms recommend to judges whether a person who’s been arrested should be released. These advanced mathematical models — or “weapons of math destruction” as data scientist Cathy O’Neil calls them — appear colorblind on the surface but they are based on factors that are not only highly correlated with race and class, but are also significantly influenced by pervasive bias in the criminal justice system. As O’Neil explains, “It’s tempting to believe that computers will be neutral and objective, but algorithms are nothing more than opinions embedded in mathematics.”
Challenging these biased algorithms may be more difficult than challenging discrimination by the police, prosecutors and judges. Many algorithms are fiercely guarded corporate secrets. Those that are transparent — you can actually read the code — lack a public audit so it’s impossible to know how much more often they fail for people of color.
Even if you’re lucky enough to be set “free” from a brick-and-mortar jail thanks to a computer algorithm, an expensive monitoring device likely will be shackled to your ankle — a GPS tracking device provided by a private company that may charge you around $300 per month, an involuntary leasing fee. Your permitted zones of movement may make it difficult or impossible to get or keep a job, attend school, care for your kids or visit family members. You’re effectively sentenced to an open-air digital prison, one that may not extend beyond your house, your block or your neighborhood. One false step (or one malfunction of the GPS tracking device) will bring cops to your front door, your workplace, or wherever they find you and snatch you right back to jail.
Who benefits from this? Private corporations. According to a report released last month by the Center for Media Justice, four large corporations — including the GEO Group, one of the largest private prison companies — have most of the private contracts to provide electronic monitoring for people on parole in some 30 states, giving them a combined annual revenue of more than $200 million just for e-monitoring. Companies that earned millions on contracts to run or serve prisons have, in an era of prison restructuring, begun to shift their business model to add electronic surveillance and monitoring of the same population. Even if old-fashioned prisons fade away, the profit margins of these companies will widen so long as growing numbers of people find themselves subject to perpetual criminalization, surveillance, monitoring and control....
Many reformers rightly point out that an ankle bracelet is preferable to a prison cell. Yet I find it difficult to call this progress. As I see it, digital prisons are to mass incarceration what Jim Crow was to slavery.
If you asked slaves if they would rather live with their families and raise their own children, albeit subject to “whites only” signs, legal discrimination and Jim Crow segregation, they’d almost certainly say: I’ll take Jim Crow. By the same token, if you ask prisoners whether they’d rather live with their families and raise their children, albeit with nearly constant digital surveillance and monitoring, they’d almost certainly say: I’ll take the electronic monitor. I would too. But hopefully we can now see that Jim Crow was a less restrictive form of racial and social control, not a real alternative to racial caste systems. Similarly, if the goal is to end mass incarceration and mass criminalization, digital prisons are not an answer. They’re just another way of posing the question.
Some insist that e-carceration is “a step in the right direction.” But where are we going with this? A growing number of scholars and activists predict that “e-gentrification” is where we’re headed as entire communities become trapped in digital prisons that keep them locked out of neighborhoods where jobs and opportunity can be found.
If that scenario sounds far-fetched, keep in mind that mass incarceration itself was unimaginable just 40 years ago and that it was born partly out of well-intentioned reforms — chief among them mandatory sentencing laws that liberal proponents predicted would reduce racial disparities in sentencing. While those laws may have looked good on paper, they were passed within a political climate that was overwhelmingly hostile and punitive toward poor people and people of color, resulting in a prison-building boom, an increase in racial and class disparities in sentencing, and a quintupling of the incarcerated population.
November 9, 2018 in Collateral consequences, Procedure and Proof at Sentencing, Purposes of Punishment and Sentencing, Race, Class, and Gender, Technocorrections | Permalink | Comments (3)