Big Data Sentences Could Undermine Fairness, Attorney General Argues

Using computerized predictions of future crime to sentence today’s offenders “may inadvertently undermine our efforts to ensure individualized and equal justice,” argued Attorney General Eric Holder last Friday, addressing a Philadelphia meeting of the National Association of Criminal Defense Lawyers. “By basing sentencing decisions on static factors and immutable characteristics – like the defendant’s education level, socioeconomic background, or neighborhood,” such policies “may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.”

Holder distinguished automated sentencing decisions from automated predictions regarding probation or parole, which he said have “for years, successfully aided in parole boards’ decision-making about candidates for early release. . . Data can also help design paths for federal inmates to lower these risk assessments, and earn their way towards a reduced sentence, based on participation in programs that research shows can dramatically improve the odds of successful reentry. Such evidence-based strategies show promise in allowing us to more effectively reduce recidivism.”

But as Philadelphia’s own experience shows, many of Holder’s sentencing concerns apply with similar force to probation and parole decisions. Philadelphia’s data-driven parole scoring system, developed with researchers at the University of Pennsylvania and described in a federally funded technical report, attempts to predict the likelihood of a serious offense after release and imposes more stringent conditions on the release of inmates judged to be at higher risk. It relies heavily on factors that may be unfair to the individuals being judged:

  • The offender’s home zip code is a factor used to predict recidivism risk. As a result, offenders from the wrong parts of town face greater scrutiny than others when released into supervision.

  • The predictions in Philadelphia’s system are based on “criminal history data for Philadelphia alone,” ignoring offenses committed outside the city limits. As a result, the system is biased toward viewing long-time city residents (whose past contact with the legal system is included in its database) as relatively higher-risk, while artificially underestimating the risk levels for offenders whose past history took place in other jurisdictions.

  • The Philadelphia model uses “charge counts — as opposed to conviction counts — to represent each offender’s criminal history,” contradicting what the report’s authors call “a certain desire to structure supervision around what the offenders were convicted of in court, instead of the offenses that [they] were merely charged . . . with committing.” This puts the matter lightly: particularly given the culture of plea bargaining, it is not unusual for defendants to be charged with crimes more serious than those they actually committed.
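The actual Philadelphia model is specified in the technical report; the sketch below is purely illustrative, with invented weights, feature names, and placeholder zip codes. It only shows the structural concern raised above: once home zip code is an input, two offenders with identical records can receive different risk scores based solely on where they live.

```python
# Purely illustrative risk-scoring sketch -- NOT the Philadelphia model.
# All weights, thresholds, and zip codes here are invented.

HIGH_RISK_ZIPS = {"19132", "19133"}  # hypothetical "wrong parts of town"

def risk_score(charge_count: int, prior_jail_terms: int, home_zip: str) -> float:
    # Charges (not convictions) drive the score, echoing the report's choice.
    score = 0.5 * charge_count + 1.0 * prior_jail_terms
    # A neighborhood effect: certain zip codes start with a higher baseline.
    if home_zip in HIGH_RISK_ZIPS:
        score += 2.0
    return score

# Identical criminal histories, different neighborhoods:
print(risk_score(3, 1, "19132"))  # 4.5
print(risk_score(3, 1, "19103"))  # 2.5
```

The same record scores differently purely because of the zip-code term; that gap, not the invented numbers, is the point.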

The predictive payoff from Philadelphia’s system is real, but far from perfect: fewer than 10% of all people released are charged with a serious offense within two years, but nearly 40% of those judged “high risk” are charged with one.
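To see what those two figures imply together, a quick back-of-the-envelope calculation (using the rounded rates above, not the report’s exact numbers):

```python
# Rounded rates from the discussion above, for illustration only.
base_rate = 0.10       # share of all released people charged with a
                       # serious offense within two years (< 10%)
high_risk_rate = 0.40  # share of the "high risk" group charged (~ 40%)

lift = high_risk_rate / base_rate  # how much likelier than average
not_charged = 1 - high_risk_rate   # flagged people who are NOT charged

print(f"lift over base rate: {lift:.0f}x")
print(f"high-risk but never charged: {not_charged:.0%}")
```

The flagged group is roughly four times likelier than average to be charged, yet about 60% of those labeled “high risk” are never charged with a serious offense, and all of them face the stricter release conditions.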

But even given these imperfections, risk-based assessments perform an important role. As Attorney General Holder observed, they have resulted in “reduced prison populations – and importantly, those reductions are disproportionately impacting men of color.”

Avoiding excessive punishment and treating each person fairly in light of his or her individual circumstances are both important goals. And they are both achievable. The social justice community has a crucial role to play in making sure that neither is neglected.
