
A Judge Lets an App Help Him Decide a Wisconsin Man’s 6-Year Sentence

A case in which a judge used an undisclosed software algorithm to determine a defendant’s sentence has caught the interest of the U.S. Supreme Court.

Here’s what the U.S. Constitution’s 6th Amendment says about a defendant’s right to know who has made an accusation against them, and what that accusation is:

In all criminal prosecutions, the accused shall enjoy the right to be confronted with the witnesses against him.

This right was later extended from federal to state prosecutions by the 14th Amendment. In March, the U.S. Supreme Court asked the federal government to file a “friend of the court” brief looking into the case of Eric L. Loomis, the defendant in a recent Wisconsin case whose 6th Amendment rights may have been violated in a chilling way.

A company called Equivant creates and sells what it describes on its web page as “Case Management and Decision Support Software for Courts, Attorneys, Supervision, and Inmate Classification.” Its mission, broadly, is to provide data management to the justice system. “We do this through our deep domain knowledge, modern technologies, and expert services that help promote public and individual safety by informing decisions at every step,” says its About page. There’s a lot of data to track in a case, and to that extent it’s easy to understand how Equivant’s products help coordinate relevant information for an often overwhelmed system.

In the Wisconsin case, the court used an Equivant software product intended for jails called “Compas Classification.” The app offers the following features:

  • Classification – Decision Tree Primary Classification and Reclassification Instruments
  • NIJ Mental Health Screening & Brief Jail Mental Health Screening
  • PREA Assessment and After-Action Plan
  • Inmate Housing Management
  • Inmate Discipline
  • Gang Tracking
  • Inmate Programs Management
  • Inmate Complaint/Grievance Management
  • Reporting
Not explicitly mentioned in this list, but referred to in a couple of locations on the company’s page, is that Compas assesses an inmate’s degree of “risk.” The company touts its “Nationally Recognized Decision Tree Model”: “The classification instruments are fully automated with appropriate information displayed to answer each risk split. The classification officer can seamlessly add criminal history, disciplinary history alert flags and holds/detainers while conducting the classification.”

In the case of an inmate, “risk” would apply to misbehavior. In a court context, it could be interpreted as an indicator of the likelihood of committing further crimes, or recidivism.
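To make the “decision tree” idea concrete: such a model answers a series of yes/no questions (the “risk splits” Equivant mentions) until it arrives at a label. The sketch below is purely hypothetical — Compas’s actual model, factors, and thresholds are proprietary and undisclosed, and none of the inputs or cutoffs here come from Equivant.

```python
# Hypothetical sketch of a decision-tree risk classifier. Every factor and
# threshold here is invented for illustration; Compas's real model is secret.

def risk_level(prior_offenses: int, age: int, failed_to_appear: bool) -> str:
    """Walk a hand-written decision tree and return a risk label."""
    if prior_offenses >= 3:        # first "risk split"
        return "high"
    if failed_to_appear:           # second split, refined by age
        return "high" if age < 25 else "medium"
    return "medium" if prior_offenses > 0 else "low"

print(risk_level(prior_offenses=0, age=40, failed_to_appear=False))  # low
print(risk_level(prior_offenses=1, age=22, failed_to_appear=True))   # high
```

Even this toy version shows the transparency problem the article describes: the label alone (“high”) tells a defendant nothing about which split produced it, and without access to the tree itself there is no way to contest the factors or thresholds behind the score.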


The judge in the Loomis case used Compas in formulating a six-year sentence for Loomis after the prosecutor presented Compas-generated bar charts indicating that he was likely to commit further crimes. (Loomis was on trial for eluding police while driving a car that had been used in a shooting.) The prosecutor pointed out to the court that Compas revealed “a high risk of violence, high risk of recidivism, high pretrial risk.” When the judge informed Loomis that “you’re identified, through the Compas assessment, as an individual who is a high risk to the community,” it became clear that Compas had helped shape the judge’s opinion of Loomis. Loomis appealed to the Wisconsin Supreme Court, which upheld the lower court’s ruling.

The issue here is that no one outside of Equivant, including the accused, can know how Compas arrived at its unfavorable assessment of Loomis. The software uses a proprietary modeling algorithm for assessing risk, and the company has not revealed, and apparently will not reveal, exactly how it does what it does. The algorithm is, after all, part of the secret sauce in a commercial product.

But what this means is that Loomis doesn’t know why he’s been classified as a recidivism risk. The 6th Amendment ensures that a defendant knows the reasons behind an accusation, and the proprietary nature of commercial software such as Compas virtually guarantees that this can’t happen.


As dependence on software, and blind trust in its programmers, infiltrates our culture and, in particular, our legal system, either some new method of transparency has to be found, or the 6th Amendment’s protection has to be discarded.

This is why the U.S. Supreme Court is interested in Loomis’ case. When Chief Justice John Roberts was asked in April by the president of Rensselaer Polytechnic Institute if he could envision a day “when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?” he answered simply, “It’s a day that’s here,” adding, “and it’s putting a significant strain on how the judiciary goes about doing things.”

