Crime-Predicting Software Falls Short

By Kristen Walsh

In 2013, a Wisconsin man pleaded guilty to attempting to flee an officer and no contest to operating a vehicle without the owner’s consent. Neither crime mandates prison time; nevertheless, he was sentenced to six years in prison. At sentencing, the judge cited his high risk of recidivism, as predicted by COMPAS, as a factor supporting imprisonment. He challenged the use of the algorithm as a violation of his due process rights, but the Wisconsin Supreme Court rejected the challenge, and the Supreme Court of the United States declined to hear the case. By declining review, the Supreme Court effectively condoned the software’s use.

Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) is a web-based tool designed to assess offenders’ criminogenic needs and risk of recidivism. COMPAS is one of the best-known tools used by criminal justice agencies to inform decisions regarding the placement, supervision, and case management of offenders. It provides separate risk estimates for violence, recidivism, failure to appear, and community failure. To assess risk, COMPAS relies on variables that include criminal history, needs assessment, criminal attitudes, social environment, and additional factors such as socialization failure, criminal opportunity, criminal personality, and social support.

Criminal justice algorithms like COMPAS are widely used across the country to assist in setting bail, determining sentences, and even contributing to determinations of guilt or innocence. Recently, Science Advances published a study finding that randomly selected people could predict whether a criminal defendant would be convicted of a future crime with about 67% accuracy, the same rate as COMPAS. The racial biases in their predictions were also nearly identical.

This software is playing a role in sending people to prison. It should therefore be held to the highest standards, or its use in the judicial system should be limited until its flaws are resolved.

To fix the biases in the criminal justice system, we must identify their sources and work to change them. Racial bias has been identified as a major issue throughout the system, and COMPAS is intended to help reduce it. The software is supposed to interpret a new individual’s data and compare it with the historical data it holds to produce a fair, impartial risk assessment.

Though well intentioned, the software was found to reflect the same racial biases already present in the judicial system. This news is troubling, but it makes sense. The data the software relies on is saturated with the existing biases of our criminal justice system. When a new offender comes along and their data is compared to the data in the system, the software looks for similarities and will likely predict a similar outcome. Instead of producing new, impartial determinations, it perpetuates the same old biases.
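COMPAS’s actual model is proprietary, so the mechanism can only be illustrated in the abstract. The following is a minimal sketch, assuming a simple logistic-regression-style risk model trained on historical outcomes; the feature names and data are entirely hypothetical, but the sketch shows how bias baked into the historical record reappears in the risk scores.

```python
# Minimal illustration (hypothetical data): a risk model trained on biased
# historical outcomes reproduces that bias in its predictions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical features: number of prior arrests and a group label (0 or 1).
priors = rng.poisson(2, size=n)
group = rng.integers(0, 2, size=n)

# Hypothetical historical "reoffended" labels that are biased: group 1 was
# recorded as reoffending more often at the same level of underlying behavior.
true_rate = 0.2 + 0.05 * priors
recorded_rate = np.clip(true_rate + 0.15 * group, 0, 1)
reoffended = rng.random(n) < recorded_rate

# Train on the biased historical record.
X = np.column_stack([priors, group])
model = LogisticRegression().fit(X, reoffended)

# The model now scores otherwise-identical individuals differently by group.
same_priors = np.array([[2, 0], [2, 1]])
print(model.predict_proba(same_priors)[:, 1])  # higher predicted "risk" for group 1
```

In this toy example, two people with identical prior records receive different risk scores solely because of their group label, since the model has learned the skew in the historical data it was trained on.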

Specifically, the study found that, compared to white defendants, a larger share of black defendants were falsely predicted to reoffend. This must be evaluated and changed before judges should even be able to consider COMPAS as a factor in their decision-making process. Instead of fixing the problem, the software is simply adding to it.

While it is important to introduce new technology into the legal field, until this software can properly be relied upon, judges should not let it displace their years of experience and individualized judgment.

Student Bio: Kristen Walsh is a staff member of the Journal of High Technology Law. She is currently a 3L at Suffolk University Law School and holds a B.A. in Public Relations with minors in Legal Studies and Political Science from Roger Williams University.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
