State Supreme Court Says Secret Software Used In Sentencing Determinations Not A Violation Of Due Process Rights
from the not-as-long-as-it's-used-perfectly-within-an-impossible-set-of-confines dept
An algorithm is deciding certain criminal defendants should spend more time in prison. And that determination can't be fully challenged because the code belongs to a private company which provides the software to the government.
Eric Loomis was determined to be a "high risk" defendant, based on something called a "COMPAS score." COMPAS -- Correctional Offender Management Profiling for Alternative Sanctions -- cranks out Presentence Investigation Reports for use in the courtroom, utilizing a number of factors to generate a score that lets judges know how likely the defendant is to re-offend.
The problems with this system are numerous. For one, the code is proprietary, so defendants aren't allowed to examine the factors that led to their scores -- unlike sentencing guidelines created by the government, which are open to public examination.
Another problem is that the algorithm engages in demographic profiling -- generally considered to be a bad thing when it comes to determining criminal behavior.
Back in May, ProPublica published an investigation into this kind of risk-assessment software and found that the algorithms were racially biased. ProPublica compared the scores given to white defendants and black defendants against actual outcomes -- whether or not they went on to commit crimes. It found that in Broward County, Florida, which was using software from a company called Northpointe, black defendants were more likely to be mislabeled with high scores, while white defendants were more likely to be mislabeled with low scores.
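The comparison ProPublica ran boils down to measuring mislabel rates separately for each group. Here's a minimal sketch of that arithmetic -- the function name and the toy data are hypothetical illustrations, not ProPublica's actual code or Northpointe's data:

```python
# Sketch of a group-level mislabel comparison (hypothetical data).
# Each record pairs the tool's binary "high risk" label with whether
# the person actually re-offended during the follow-up period.

def mislabel_rates(records):
    """Return (false_positive_rate, false_negative_rate) for one group.

    False positive: labeled high risk, but did not re-offend.
    False negative: labeled low risk, but did re-offend.
    """
    fp = sum(1 for high, reoffended in records if high and not reoffended)
    fn = sum(1 for high, reoffended in records if not high and reoffended)
    non_reoffenders = sum(1 for _, reoffended in records if not reoffended)
    reoffenders = sum(1 for _, reoffended in records if reoffended)
    return fp / non_reoffenders, fn / reoffenders

# Toy groups: tuples of (labeled_high_risk, actually_reoffended)
group_a = [(True, False), (True, True), (True, False), (False, False)]
group_b = [(False, True), (False, False), (True, True), (False, True)]

fpr_a, fnr_a = mislabel_rates(group_a)  # group A: often flagged high, rarely re-offends
fpr_b, fnr_b = mislabel_rates(group_b)  # group B: often scored low, often re-offends
```

A disparity like the one ProPublica reported shows up as one group's false positive rate exceeding the other's, even when overall accuracy looks similar.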
"Fits the profile" is the new "fits the description" -- something that seems predisposed to putting blacks behind bars more frequently and for longer periods of time. Eric Loomis tried to challenge his COMPAS score but got nowhere with it, as the math behind it is locked up by Northpointe, which claims giving a defendant access to its trade secrets would pose a serious risk to its profitability.
Loomis argued that not giving him access posed a serious risk to his freedom. Allowing Northpointe to keep its algorithm secret was a violation of his due process rights, as it presented an unchallengeable score that could be used to keep him locked up longer than the normal range for the criminal activity he was convicted for.
His case went up the ladder to the Wisconsin Supreme Court, which has found [PDF] that defendants being unable to fully challenge a sentencing determination isn't a Constitutional problem.
Ultimately, we conclude that if used properly, observing the limitations and cautions set forth herein, a circuit court's consideration of a COMPAS risk assessment at sentencing does not violate a defendant's right to due process.
We determine that because the circuit court explained that its consideration of the COMPAS risk scores was supported by other independent factors, its use was not determinative in deciding whether Loomis could be supervised safely and effectively in the community. Therefore, the circuit court did not erroneously exercise its discretion. We further conclude that the circuit court's consideration of the read-in charges was not an erroneous exercise of discretion because it employed recognized legal standards.
Accordingly, we affirm the order of the circuit court denying Loomis's motion for post-conviction relief requesting a resentencing hearing.
The downside of this decision is that Northpointe cannot be forced to hand over its algorithm for examination by criminal defendants. The upside is that the court has issues with using COMPAS scores to determine sentence lengths.
[T]he opinion comes with some interesting caveats about things judges need to keep in mind when using risk scores in sentencing decisions. The two most important points they're asked to keep in mind are that the software has been found to be racially biased, and that it needs to be constantly monitored and updated with new information. (If you're relying on data from five or ten years ago, it's not going to be accurate.)
The court also notes in passing that the software was never intended to be used to determine sentence lengths. It was supposed to be used by the Department of Corrections to assess risks posed by parolees or those requesting parole. But the court does not go so far as to forbid the use of COMPAS scores in sentencing decisions. Nor does it suggest that opening up the algorithm for examination might bring much-needed transparency to the sentencing process. Instead, the Supreme Court says judges must walk a very fine line when utilizing COMPAS scores.
The queasiness that judges feel about algorithmic risk-assessment is reflected in the concurring opinion filed by Justice Patience Drake Roggensack. "Reliance would violate due process protections," she writes. "Accordingly, I write to clarify our holding in the majority opinion: consideration of COMPAS is permissible; reliance on COMPAS for the sentence imposed is not permissible."
Unless a whole lot of judicial explanation accompanies every sentencing decision utilizing a COMPAS score, it's going to be almost impossible for defendants to tell whether a judge has just "considered" Northpointe's presentence investigation reports… or "relied" on them. Any sentence not hitting the upper end of the software's recommendations could be viewed as mere "consideration," even if "reliance" might be a more accurate term.
Without being allowed to closely examine COMPAS scores, defendants still aren't being given a chance to challenge any erroneous information that might be included in these reports. The court's reluctance to fully endorse the use of the software in sentencing decisions is a step forward, but it still allows judges to hand down sentences based on secret formulas that have already shown a predilection for recommending longer sentences to certain demographic groups.