AI Isn't Making The Criminal Justice System Any Smarter

from the BIAS-2.0 dept

We’ve covered the increasing reliance on tech to replace human judgment in the criminal justice system, and the news just keeps getting worse. The job of sentencing is being turned over to software owned by private contractors, which puts a non-governmental party between defendants and challenges to sentence length.

The systems being used haven’t been rigorously tested and are as prone to bias as the humans they’re replacing. The system used by Washington, DC courts to sentence juvenile defendants has never been examined at all, yet it’s still being used to determine how long a person’s freedom should be taken away.

This system had been in place for 14 years before anyone challenged it. Defense lawyers found nothing that explained the court’s confidence in using it to sentence juveniles.

[I]n this particular case, the defense attorneys were able to get access to the questions used to administer the risk assessment as well as the methods of administering it. When they dug into the validity behind the system, they found only two studies of its efficacy, neither of which made the case for the system’s validity; one was 20 years old and the other was an unreviewed, unpublished Master’s thesis. The long-held assumption that the system had been rigorously validated turned out to be untrue, even though many lives were shaped due to its unproven determination of ‘risk’.

One system used in courts all over the nation is developed by Equivant (formerly Northpointe). It’s called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). COMPAS uses a set of questions to determine how much of the book is thrown at defendants, using data that only makes the United States’ carceral habits worse.

Northpointe’s core product is a set of scores derived from 137 questions that are either answered by defendants or pulled from criminal records. Race is not one of the questions. The survey asks defendants such things as: “Was one of your parents ever sent to jail or prison?” “How many of your friends/acquaintances are taking drugs illegally?” and “How often did you get in fights while at school?” The questionnaire also asks people to agree or disagree with statements such as “A hungry person has a right to steal” and “If people make me angry or lose my temper, I can be dangerous.”

The US locks up an alarming number of people every year and an alarming percentage of them are black. Feed this data into a system that wants to see if it’s locking up enough black people and the data will tell judges to keep hitting black people with longer sentences. It’s a feedback loop no one can escape from. Every new sentence using these calculations only adds more data telling the system it’s “right.”
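The loop is simple enough to simulate. Here's a minimal sketch in Python (this is not COMPAS's actual model; the groups, threshold, and numbers are all invented for illustration) of what happens when a score trained on past sentences also generates the next ones:

```python
import random

random.seed(0)

# Invented historical record in which group B was already sentenced
# more harshly than group A.
history = {"A": {"harsh": 200, "lenient": 800},
           "B": {"harsh": 500, "lenient": 500}}

def risk_score(group):
    """Score a new defendant by the harsh-sentence rate in the record."""
    g = history[group]
    return g["harsh"] / (g["harsh"] + g["lenient"])

def sentence(group):
    """A 'risky' score yields a harsh sentence, which is written back."""
    outcome = "harsh" if risk_score(group) > 0.3 else "lenient"
    history[group][outcome] += 1  # today's output is tomorrow's training data
    return outcome

for _ in range(10_000):
    sentence(random.choice("AB"))

print({g: round(risk_score(g), 2) for g in "AB"})
# Prints something like {'A': 0.03, 'B': 0.92}: the initial gap only
# widens, because every harsh sentence the score produces becomes new
# "evidence" that group B is risky.
```

Nothing in the loop ever asks whether the original record was fair; the score simply converges on whatever the past sentencing data already said.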

Not only does the “improved” system introduce its own algorithmic biases, those biases are no better than the human ones it replaces. And the system has been proven wrong repeatedly: it spits out lower recidivism risk scores for white defendants, only to have those defendants commit more crimes in the future than their black counterparts, even when black people arrested for the same criminal activity were given considerably higher risk scores by COMPAS.
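ProPublica's finding boils down to asymmetric error rates. Here's a minimal sketch of that tabulation in Python, with invented counts that only loosely echo the published figures (this is not real COMPAS data):

```python
# Each tuple is (labeled_high_risk, reoffended_within_two_years).
# Counts are invented for illustration, loosely echoing ProPublica's figures.
outcomes = {
    "white": [(True, True)] * 52 + [(False, True)] * 48 +
             [(True, False)] * 23 + [(False, False)] * 77,
    "black": [(True, True)] * 72 + [(False, True)] * 28 +
             [(True, False)] * 45 + [(False, False)] * 55,
}

for group, rows in outcomes.items():
    fp = sum(1 for flagged, reoff in rows if flagged and not reoff)
    fn = sum(1 for flagged, reoff in rows if not flagged and reoff)
    non_reoffenders = sum(1 for _, reoff in rows if not reoff)
    reoffenders = sum(1 for _, reoff in rows if reoff)
    print(f"{group}: false positive rate {fp / non_reoffenders:.0%}, "
          f"false negative rate {fn / reoffenders:.0%}")

# white: false positive rate 23%, false negative rate 48%
# black: false positive rate 45%, false negative rate 28%
# Non-reoffending black defendants get flagged "high risk" far more often,
# while reoffending white defendants are far more likely to be under-flagged.
```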

That’s not the only problem. Since it’s privately owned, defense lawyers and researchers have been unable to examine the software itself. You may be able to challenge it based on sentencing data (if you can even manage to get that), but you won’t be able to attack the software itself because it wasn’t developed by the government.

Equivant doesn’t have to share its proprietary technology with the court. “The company that makes COMPAS has decided to seal some of the details of their algorithm, and you don’t know exactly how those scores are computed,” says Sharad Goel, a computer-science professor at Stanford University who researches criminal-sentencing tools. The result is something Kafkaesque: a jurisprudential system that doesn’t have to explain itself.

The new way gives us the same results as the old way. But it can’t be examined. It can only be questioned, and that’s not really getting anyone anywhere. A few sentences have been challenged, but every day it’s in use, COMPAS keeps generating sentences for “risky” defendants. And these sentences go right back into the database, confirming the software’s biases.

Companies: equivant, northpointe


Comments on “AI Isn't Making The Criminal Justice System Any Smarter”

30 Comments
Anonymous Anonymous Coward (profile) says:

Sign here, no need to read the fine print.

I’d like to see the contract signed by the courts when they ‘purchase’ this software. Purchase is in quotes because it might be sold as a service, and they, like others, don’t actually own what they buy.

Somehow I think the sales contract benefits the seller and prevents any meaningful evaluation of the product. That the courts, made up of lawyers (judges are lawyers…right?), didn’t and still don’t question the validity of the software seems more political than judicial or judicious. Is this what we get from electing judges, or are appointed judges just as political?

That a system of justice considers software that cannot be proven to provide justice to be just is anathema. How is it that these questions have not reached a higher court and the entire concept been shot down?

Anonymous Coward says:

Prisoners, modern slaves

Don’t forget that many prison systems use the people there as slave labor, creating everything from your underwear to armor. It doesn’t matter if you are innocent of the crimes, or mentally incapable of being responsible for your actions: you are in their power and you will do as they command or be punished with extra time added to your sentence. People are profiting off the children and adults forced into losing their rights. Programs like this just give the corrupt more power.

Harold Fck the only thing missing is 'osdi' says:

Re: Not really Artificial Intelligence. Nor "carceral"

Now, even if I share all your reservations, you must have some data that suggests intrinsic unfairness AND some workable alternative before you can really criticize.

Would you have the prior highly NON-uniform "system" of judges deciding, with personal spleen or prejudice, irritation from sitting on hemorrhoids all day, and so on? What’s your far better alternative, sonny?

Otherwise, this is just innuendo and lawyering (not necessarily inappropriate for particular cases when a big number turns up).

Any resulting unfairness is part of larger societal problems with ignoring petty crime and letting it constantly grow, evident daily right here on Techdirt, where some routinely advocate violating the Rights of persons by stealing copyrighted works. As I’ve long said, anyone who pirates lousy entertainments has slipped the bounds of civilized society and is headed toward a career of crime (those already far down that slippery slope will of course disagree); the longer they go without being caught, the further the excusing goes: making up entitlement, "not harming anyone", a "natural right to copy", blaming the creators for not having a magical "new business model" immune to rampant theft, casting creators as Evil, and so on.


Anonymous Coward says:

Re: Re: Not really Artificial Intelligence. Nor "carceral"

Would you have the prior highly NON-uniform "system" of judges deciding, with personal spleen or prejudice, irritation from sitting on hemorrhoids all day, and so on?

Than a black-box program? Absolutely. Judges obviously have their own flaws, some more than others. But replacing them with a secret formula is a step down, not a step up.

Anonymous Coward says:

Pretextual Prejudice as the Goal

Really, the use of "AI" in sentencing has always seemed aimed not at rehabilitation or fairness but at providing a pretext to smuggle in prejudices. This is best illustrated by the UK algorithm which inexplicably took the size of one’s garden (yard, in the US sense) as a sentencing parameter. There is literally no reason to include that other than classism or tribal bias.

If they thought that yard work would be the crucial ingredient to rehabilitation they would have already included it in their prison system.

That Anonymous Coward (profile) says:

Everyone is worshiping at the altar of AI, it will be better, smarter, faster….
I just have one question…
What perfect person are they modeling the training from?

AI does away with all of the bias and flaws!!!
Except for those left in place by well-meaning people who still can’t imagine themselves in someone else’s shoes.

The AI is taught the ‘common wisdom’ that we all know…
Priests are good people – Uhhh

Gays are pedophiles – Nope, but the louder the family values guy screams it, the more kids he’s diddling.

Black people rob everyone – This has absolutely nothing to do with laws that specifically target them, or with the fact that a white kid & a black kid doing the exact same things have drastically different outcomes.

There is no way to get the bias out of AI, we are tragically flawed teachers who can’t see the trees for the forest.

We can’t teach morals (points at the flaming failures of the moral majority & others) because we can’t follow them ourselves.

KILLING A CHILD IS HORRIBLE, EVERY LIFE IS SACRED!!!

EXECUTE THE PRISONER!!!!

uhhh what?

We want to pretend we have compassion for bad childhoods, broken homes, etc etc… but we really don’t. You can feel for the teen who was sexually abused, but you are then afraid to send the wrong message if you don’t sentence them to death for killing their abuser.

The best use of AI in the legal system would be making sure charges aren’t stacked to the sky to force a plea & to actually push prosecutors to do their jobs when it is clear the law was broken, even if it means some Union head screams at you on tv.

Another good use would be reviewing sentences: we can give it recidivism rates & all the data on people who were convicted of the same crimes & their sentences.

Hell AI could even look at QI cases and wonder if the Judges are brain damaged. AI could show us the flaws in cases with the same crimes but the different outcomes, when justice is supposed to be blind but clearly is peeking.

I would be more interested in seeing AI examine a history of cases & ask why the outcomes were different.
AI doesn’t care if the shooter was a cop; the AI cares that an armed person forced their way into a home & killed the person who lived there, & would remind us what the charges should be. The union head can scream all they want… but it’s a computer applying the law evenly in every case, without worrying about blue flu & them throwing other cases.

David says:

So what are the alternatives?

How else are you going to justify different legal standards for people of different social classes? The U.S. legal system allows rich persons to have poorer persons thrown in jail, or to ruin their lives financially, by virtue of everybody paying their own legal costs in proportion to the competence of their representation, meaning that the financially underrepresented party will have to take a plea deal.

Without sentencing adjusted to the severity of one’s social standing, defendants may be tempted to scratch together the money for a competent legal defense and forego the plea deal without being suitably punished for their insubservience to the system.

This double standard allows for the peaceful coexistence of superior and inferior races. You don’t want to revert to the state where you have to kill all the Red Indians lest they ask for their land back or lynch the blacks because they don’t know their place in society and consider themselves entitled to equal rights as if.

TFG says:

Re: Re: Re:

I believe you have misinterpreted the analogy. The analogy does not compare Judges to unknown people in a black box.

Rather, the analogy takes the concept of a black box that spits out numbers, which is what we have now in a software format, and removes the software to replace it with a human element.

Society would not accept some unknown person, above all forms of redress, handing out sentences, but for some reason software is fine. That’s the point of the analogy.

Mark says:

Study cited in the article has been debunked

The study you are citing (Angwin et al. from ProPublica) has been debunked. No bias could be shown to exist in the Northpointe software.

Source 1: "Algorithms in the Justice System: Some Statistical Issues", Royal Statistical Society (November 08, 2018) which calls the ProPublica study "ill-founded".
Source 2: Flores, Bechtel, Lowencamp; Federal Probation Journal, September 2016, "False Positives, False Negatives, and False Analyses: A Rejoinder to “Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And it’s Biased Against Blacks.”", URL http://www.uscourts.gov/statistics-reports/publications/federal-probation-journal/federal-probation-journal-september-2016
In fact the ProPublica analysis was so poorly done (bad sampling, bad analysis, basic statistical mistakes) that the authors wrote: "It is noteworthy that the ProPublica code of ethics advises investigative journalists that "when in doubt, ask" numerous times. We feel that Larson et al.’s (2016) omissions and mistakes could have been avoided had they just asked. Perhaps they might have even asked…a criminologist? We certainly respect the mission of ProPublica, which is to "practice and promote investigative journalism in the public interest." However, we also feel that the journalists at ProPublica strayed from their own code of ethics in that they did not present the facts accurately, their presentation of the existing literature was incomplete, and they failed to "ask." While we aren’t inferring that they had an agenda in writing their story, we believe that they are better equipped to report the research news, rather than attempt to make the research news."

That said, people have tried fixing the criminal justice system for years: sentencing guidelines, sensitivity training, whatever… When humans made the decisions, the outcomes apparently weren’t much different, or any less biased. The algorithmic approaches have the advantage that they are consistent and (in theory) verifiable (how else do you get a large enough sample for each judge to test that they are not biased?). We can discuss the particular implementation (maybe COMPAS isn’t the best one to use), but I do think using algorithms that optimize for a pre-determined outcome (say, no new offense once released) can be a better way of handling sentencing.
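To make the "verifiable" point concrete, and to show why both sides of the COMPAS dispute can point at the same data, here is a minimal sketch with invented confusion-matrix counts (not real COMPAS numbers): a tool can be calibrated, with flagged defendants reoffending at the same rate in every group, while still showing the unequal false positive rates ProPublica reported, once the groups’ base rates differ.

```python
# Invented confusion-matrix counts for two groups with different base rates.
# tp: flagged & reoffended      fp: flagged & didn't
# fn: not flagged & reoffended  tn: not flagged & didn't
groups = {
    "A": dict(tp=15, fp=10, fn=15, tn=60),   # base rate 30%
    "B": dict(tp=45, fp=30, fn=15, tn=10),   # base rate 60%
}

for name, c in groups.items():
    ppv = c["tp"] / (c["tp"] + c["fp"])   # calibration-style metric (Flores et al.)
    fpr = c["fp"] / (c["fp"] + c["tn"])   # error-rate metric (Angwin et al.)
    print(f"{name}: flagged defendants reoffend at {ppv:.0%}, "
          f"false positive rate {fpr:.0%}")

# A: flagged defendants reoffend at 60%, false positive rate 14%
# B: flagged defendants reoffend at 60%, false positive rate 75%
# The tool is "equally accurate" by one metric and very unequal by the
# other; with different base rates you generally can't equalize both.
```

Which metric an audit should hold fixed is exactly what the two camps disagree about, but at least with an algorithm the audit is possible.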

Anonymous Coward says:

Re: Study cited in the article has been debunked

The attempt to replace judges with software is an ill-fated venture.
What’s next … Mitt screaming "Software is people, my friend"?

Does this apply to serious cases? Probably not. It will probably be used in conjunction with some AI public defender software designed to make plea bargains run more efficiently, thus keeping the prison system full and enabling bonuses and dividends for a few careless individuals.

This will not bring any consistency to the multi-tiered justice system where wealth equals immunity.

Anonymous Coward says:

Re: Study cited in the article has been debunked

Even if the black box isn’t spitting out biased results at this point in time — its inability to explain itself and the lack of accountability for any problems with it that might crop up in the future are both problems in their own right. If we are going to have a Sentence-O-Matic roaming around, it better darn well be something that is at least as transparent and accountable as the rest of our justice system…

Beta (profile) says:

"The US locks up an alarming number of people every year and an alarming percentage of them are black. Feed this data into a system that wants to see if it’s locking up enough black people and the data will tell judges to keep hitting black people with longer sentences."

Umm… how do you know that? How do you know how "the system" will respond to such data? If you haven’t seen the code, you don’t know which way it’ll jump.

Anonymous Coward says:

Re: Re:

I appreciate your attempt at unbiased evaluation; however, history provides some guidance in the effort to prepare for future events.

What makes you think the powers presently in charge would allow unbiased sentencing?

If the AI is one that is supposed to learn … what will it learn from all the past sentencing that it is privy to?

Anonymous Coward says:

Who wrote the software? What do you think it does?

So if you were a Private Corporation that was able to charge whatever you wanted for services ($10.00/minute for phone calls, $5.00 for a candy bar, etc.), how would you design software for use in ‘filling’ your for-profit prison?

  1. If bed available, release not advised at this time
  2. If no bed available, release not advised at this time, request extra funding for increase in available beds.
  3. Go To 1
  4. If made it to here, advise release at this time.

PROFIT ALL THE WAY TO THE BANK BABY… I’z an xclnt coder.
