Unfair Sentencing Algorithms in Pennsylvania


A new study reports that criminal sentencing algorithms, risk scores that may have little to do with a defendant's actual likelihood of re-offending, are being incorporated into sentencing guidelines in courts nationwide.

ProPublica analyzed these so-called "risk assessments," generated by a program called COMPAS, for some 7,000 people arrested in one Florida county in 2013 and 2014. Of those marked most likely to commit future violent crimes, only 20 percent actually did. For those deemed generally likely to re-offend, including for minor misdemeanors, the assessments were more accurate, at 61 percent.

The most troubling aspect, though, was a set of inherent disparities that seemed to have nothing to do with actual behavior. The system was:

  • More likely to falsely flag black defendants as future criminals, wrongly labeling them this way at nearly twice the rate of white defendants
  • More likely to mislabel white defendants as low risk than black defendants

This pattern held across the board, regardless of the types of crimes previously committed. These results are deeply troubling (though not entirely surprising) to our Scranton criminal defense attorneys.
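Disparities like these are typically measured by comparing error rates across groups: among people who never re-offended, how often were they flagged as high risk, and among people who did re-offend, how often were they labeled low risk? Below is a minimal sketch of that comparison, assuming a hypothetical scores.csv file with race, high_risk, and reoffended columns; the file and column names are illustrative, not ProPublica's actual data layout.

```python
# Sketch of a group-wise error-rate comparison like the one described above.
# The file name and column names (race, high_risk, reoffended) are assumptions
# chosen for illustration only.
import pandas as pd

df = pd.read_csv("scores.csv")  # assumed columns: race, high_risk (0/1), reoffended (0/1)

for race, group in df.groupby("race"):
    did_not_reoffend = group[group["reoffended"] == 0]
    did_reoffend = group[group["reoffended"] == 1]

    # False positive rate: flagged high risk, but never re-offended.
    fpr = did_not_reoffend["high_risk"].mean()
    # False negative rate: labeled low risk, but did re-offend.
    fnr = 1 - did_reoffend["high_risk"].mean()

    print(f"{race}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```

A gap in the false positive rate between groups is exactly the kind of disparity described in the bullet points above.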

Recidivism risk is one of many factors involved in sentencing

Pennsylvania Code Chapter 303 outlines the state's sentencing guidelines, with factors such as multiple prior convictions, certain offense types, and the disposition at trial all having an effect. Recidivism scores are provided to inform the judge's decision on how much jail time and how severe a punishment an offender should receive. Someone deemed a lesser threat may receive a lighter sentence, and vice versa. Recidivism scores can certainly feel impartial (because they are generated by an algorithm), but as the ProPublica analysis shows, they aren't.

Yet even after these algorithms were exposed when the story first broke two years ago, they remain in place. Even when analysts tried to fix the system, they were unable to achieve an across-the-board accuracy rate above 55 percent, and they concluded it was mathematically impossible for the scores to be completely fair, or even reasonably fair, at any accuracy above that rate. Despite the common presumption that such algorithms are inherently less biased than human predictions, the opposite appears to be true: human predictions tend to be right about 65 percent of the time.

The algorithm used by COMPAS takes 137 factors into account, while the humans used just seven. Yet in a study of 462 people, the humans predicted recidivism about as accurately as the COMPAS system did.

Further, researchers found they could achieve roughly the same level of accuracy as the COMPAS system by taking just two factors into account: age and number of prior convictions. Those turned out to be the two biggest predictors of whether someone was likely to commit another crime and be caught and convicted again.
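The researchers' exact model isn't reproduced here, but a two-factor predictor of that kind is simple enough to sketch. The example below is a minimal sketch assuming a hypothetical defendants.csv file with age, priors_count, and two_year_recid columns (names chosen for illustration, not taken from the study); it trains a basic logistic regression on just those two factors and reports its accuracy.

```python
# Minimal sketch of a two-factor recidivism predictor, similar in spirit to the
# simple model the researchers describe. The CSV file and column names are
# assumptions for illustration, not the study's actual data pipeline.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("defendants.csv")  # assumed columns: age, priors_count, two_year_recid

X = df[["age", "priors_count"]]  # the two factors: age and number of prior convictions
y = df["two_year_recid"]         # 1 if the person re-offended within two years

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression()
model.fit(X_train, y_train)

preds = model.predict(X_test)
print(f"Accuracy: {accuracy_score(y_test, preds):.2f}")
```

The point the researchers were making is that a model this small can reportedly match the accuracy of the 137-factor commercial score.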

While we will continue to advocate for reform in the criminal justice system, these unfair sentencing algorithms remain a factor for the time being. A dedicated, experienced criminal defense attorney is the best tool you have to fight back against unfair sentencing.

Contact Scranton NEPA Lawyers
Mazzoni Valvano Szewczyk & Karam

Free Consultation. No Obligation. Fast Reply. Find out how we can help you.