Proprietary Algorithms Are Being Used To Enhance Criminal Sentences And Preventing Defendants From Challenging Them

from the like-the-Coke-formula,-but-for-years-of-someone's-life dept

When law enforcement agencies want to know what people are up to, they no longer have to send officers out to walk a beat. It can all be done in-house, using as many data points as can be collected without a warrant. Multiple companies offer “pre-crime” databases for determining criminal activity “hot spots,” which allow officers to draw foregone conclusions based on what someone might do, rather than what they’ve actually done.

Not that it’s doing much good. For all the time, money, and effort being put into it, the databases seem to be of little utility.

Many law enforcement agencies use software to predict potential crime hot spots, and the police in Kansas City, Mo., and other places have used data to identify potential criminals and to try to intervene.

[…]

In Chicago, where there has been a sharp rise in violent crime this year, the police have used an algorithm to compile a list of people most likely to shoot or be shot. Over Memorial Day weekend, when 64 people were shot in Chicago, the police said 50 of the victims were on that list.

So much for “intervention.” Having a list of people who have a higher risk of being shot doesn’t mean much when all it’s used for is confirming the database’s hunches. However, these same databases are being put to use in a much more functional way: determining sentence lengths for the criminals who have been arrested.

When Eric L. Loomis was sentenced for eluding the police in La Crosse, Wis., the judge told him he presented a “high risk” to the community and handed down a six-year prison term.

The judge said he had arrived at his sentencing decision in part because of Mr. Loomis’s rating on the Compas assessment, a secret algorithm used in the Wisconsin justice system to calculate the likelihood that someone will commit another crime.

We’re locking up more people for more years based on criminal activity they’ll no longer have the option of possibly performing. This is nothing new. Sentencing enhancement is based on a lot of factors, not all of them confined to proprietary databases. But what is new are the algorithms used to determine these sentence enhancements, most of which belong to private companies who are completely uninterested in sharing this crucial part of the equation with the public.

In Mr. Loomis’ case, the software determined he would be likely to engage in further criminal activity in the future. A so-called “Compas score” — provided by Northpointe Inc. — resulted in a six-year sentence for eluding an officer and operating a vehicle without the owner’s consent. His lawyer is challenging this sentence enhancement and going after Northpointe, which refuses to release any information about how the Compas score is compiled.

What Northpointe has released are statements that confirm the code is proprietary and that the Compas score is “backed by research” — although it is similarly unwilling to release this research.

The problem here isn’t so much the use of algorithms to determine sentence lengths. After all, state and federal sentencing guidelines, which include factors such as the likelihood of future criminal activity, are used all the time during sentencing. But those guidelines can be viewed by the public and are much more easily challenged in court.

The use of private contractors to provide input on sentencing renders the process opaque. Defendants can’t adequately challenge sentence enhancements without knowing the details of the “score” being presented by prosecutors to judges. The algorithms’ inner workings should either be made available to defendants upon request, or the “score” should be determined solely by government agencies, where the data and determining factors can be inspected by the public.

We’re now in the unfortunate situation where companies are telling judges how long someone should be locked up — using data which itself might be highly questionable. The feeling seems to be that if enough data is gathered, good things will happen. But as we can see from Chicago’s implementation of this technology, the only thing it’s done so far is add confirmation bias toe tags to the ever-increasing number of bodies in the city’s morgues.

The use of locked-down, proprietary code in sentencing is more of the same. It undermines the government’s assertion that prison sentences are a form of rehabilitation and replaces it with the promise that criminal defendants will “do the time” so they can’t “do the crime” — all the while preventing those affected from challenging this determination.

Companies: compas


Comments on “Proprietary Algorithms Are Being Used To Enhance Criminal Sentences And Preventing Defendants From Challenging Them”

Anonymous Anonymous Coward says:

Challenge-ability

The government should never be allowed to use something that is unable to be challenged. Ever. For anything. We are the government; they are only elected or selected to represent us. We should be able to challenge each and every thing, whenever and however we want. Any prohibition on such challenges is what should be prevented.

That Anonymous Coward (profile) says:

We are a nation that has accepted secret courts & secret laws; why should we blink at secret scores?
It’ll NEVER happen to you or anyone you love, because you are “Good People”(tm) and bad people deserve it.

While not directly related, there was a report by someone looking into one of the hot spot scoring tools cops were using. While he himself wasn’t a risk, there was some history at the location where he lived, so that raised his risk assessment even though it happened long before he resided there.

Our Magic Tiger Repelling Rock really works, but we can’t explain it… even if allowing its use undermines the basic tenets of justice.

Anonymous Coward says:

Mmmmm, Probability!

“…software to predict potential crime hot spots…”

Oooh! I started employing the sort of correlational, statistical models (ARIMA) that are useful for this kind of stuff about thirty years ago. Howzabout we apply it to predicting which municipal and other jurisdictions are most likely to illegally abuse the Constitutional rights of citizens, up to and including murder?! Cool! Instant, new business model. Oh wait, all of them, never mind.

Anonymous Coward says:

Re: Follow the money...

You got that right, which is why I purchased stock in the two for-profit prison companies a while ago. I do not foresee a drop in the prison population; I foresee it expanding, especially with the DOJ’s interest in searching all cell phones. The war on pot is going to become the war on copyright infringement. Got to keep the prisons populated, otherwise you will have too many unemployed guards.

Anonymous Coward says:

Re: Capitalism

Has nothing to do with capitalism. It has everything to do with the government getting in bed with capitalists, which it should not be doing. Socialism and eventually communism would be far worse because it always, and I mean always, leads to a very large, powerful authoritarian government. And guess what happens to dissenters? The stack of dead bodies that these systems have left behind in the last 100 years can be seen from the moon.

David says:

I don't mind how the algorithms work

But this is rubbish. You cannot serve extra time for crimes you did not commit and which you are not even suspected of having committed. What’s next? Death penalty for high crime areas? Round up and execute everybody living there? Because statistics suggest that to be a good idea?

Anonymous Coward says:

Self-fulfilling prophecy

What do you bet the algorithm points police to a specific neighborhood? When people are suddenly picked up and tossed in prison, citizens start protesting, and the police get defensive because blue lives matter more. They crack down further until every neighborhood is a hot spot of the police department’s own making.

David says:

Police are going to love this.

Now the courts are adopting the kind of “he was one of those no-do-good niggers. If he wasn’t actually guilty for this crime, I bet he was for some others. I say, hang them all and let God sort them out” logic underlying some formerly debatable forms of policing.

Whatever happened to “do the crime, serve the time”? Why should criminals from a background that doesn’t require breaking the law semi-regularly to survive get softer sentences?

Shilling says:

Re: Police are going to love this.

The NYTimes suggests it is common for these algorithms to exclude race, which in itself is preposterous, as the more data you include, the more accurate the algorithm would be. Too bad you generalise everyone with these algorithms, and the outliers will be punished even more. I thought the three strikes system (used in Wisconsin) would cover all this nonsense anyway, but I guess money has to be made.

Anonymous Coward says:

Compas doesn’t even HAVE a database, the company just says “defendant isn’t white, all those [N-words] are criminals, maximum sentence”.

Northpointe is HEAVILY invested in private prisons and wants as many people as possible to stay as long as possible, since they work them as unpaid slave labor.

David says:

Re: Re:

Compas doesn’t even HAVE a database, the company just says “defendant isn’t white, all those [N-words] are criminals, maximum sentence”.

That would be unconscionable. It’s more like “given current conviction rates and terms, we are likely to fall below 90% prison occupancy at the time his usual prison term would end, so we better extend it. Guilty, guilty, guilty, guilty! Do we need more solitary confinement bookings right now?” You can’t just make numbers up.

Daydream says:

Wait, if they can identify factors that lead to a high risk of reoffending...

Let’s assume this is true.
Let’s assume these companies have developed genuine algorithms, that can assess all of the factors affecting a person, and determine if they will be a criminal or continue to be a criminal.
Let’s assume that these algorithms, as a whole, tend towards high degrees of sensitivity and specificity. That they’re good tests.

If these algorithms know what combination of factors will make a person commit crimes, then they know what factors can be changed so that person will no longer commit crimes.
To put it another way, the profiles they construct around convicts to determine if they’re a risk to the community, can also be used in their rehabilitation.

*checks the article again*

Nope, I don’t see any mention of these algorithms being used to recommend education and training, psychological counselling, employment assistance, community service, or any other form of rehabilitation that might ACTUALLY reduce the recidivism rate.
Just…longer sentences. It’s like they WANT to keep people in jail and use them for cheap labour for the rest of their lives.

David says:

Re: Wait, if they can identify factors that lead to a high risk of reoffending...

Why would you want to reduce recidivism? It would be highly unfair to jail people longer for their recidivism potential and then work on their rehabilitation.

They served the time for their chance at recidivism; now give them an honest and unbiased opportunity to earn the prison term they have already served in advance.

Richard (profile) says:

Re: Wait, if they can identify factors that lead to a high risk of reoffending...

I don’t see any mention of these algorithms being used to recommend education and training, psychological counselling, employment assistance, community service, or any other form of rehabilitation that might ACTUALLY reduce the recidivism rate.

When for-profit prisons were first mooted in the UK, the suggestion was made that they should be paid by results, i.e. some of the fees would be held back until the prisoner had been released and had not re-offended for a set period.

That would mean that the market mechanisms would be working in our favour and the kind of algorithms you describe would be worth investing in.

Crime, and the prison population, would fall.

Anonymous Coward says:

life imitating fiction... in the worst way

Minority Report was supposed to be an example of what ‘NOT’ to do. People being judged on what they might or might not do, before they could even put thought to action, seems like a loss of free will.
Any other intervention that doesn’t result in incarceration or death would be a welcome prospect.

Whatever says:

So the cops are using technology to improve things...

and Techdirt is upset?

If the police are able to predict crime hot spots, perhaps they can work to improve policing in the area, or work to change the behaviors of the locals such that crime drops. That would be true crime prevention.

The deadpool-style list is a little bit creepier, and a whole lot harder to turn into action. They can’t add protection for every person who might be at risk. But knowing who might be killed might give them better insight into what is going on in the darker circles of life.

I guess you guys would prefer that the cops spend their time in school studying to be top-flight civil rights lawyers. Seems to be a job requirement these days!

Anonymous Coward says:

If you can't do the math in the algorithm, you shouldn't be using it

in any fashion that could deprive someone of life, liberty, or property. Otherwise what you are looking at is a “cloud based” justice system, where any of the underlying systems can be surveilled or tweaked by the hosting party.

It is not reasonable to assume that the robed golems have the faintest clue how even one of the underlying subsystems works.

In my experience the aggregation of that much data almost always reveals practical solutions during the development cycle. If you understood the code, you’d be looking at managed approaches to social justice, not punitive approaches.

Selling fear and death is easy. Teaching somebody how to observe entropy through code is a wholly different problem indeed. Those doing the former outnumber those able to do the latter by at least a thousand to one.

Anonymous Coward says:

I thought due process was a right. How can people be punished without the ability to challenge the prosecution and its methods? This case and the FBI’s NIT cases both throw that right right out the window, and at least in the FBI cases, the evidence secured was dumped because the government refused to show how the NIT works.

Did the defendant just have a court-assigned attorney who was already overworked?

Daydream says:

Perhaps these algorithms should be used in hiring police officers.

If these algorithms can so reliably determine which offenders are ‘high risk’ and therefore should be kept in prison longer, surely they can be used to determine which applicants to the police force are at ‘high risk’ of committing police brutality.

Or, y’know, determine who’s likely to blow the whistle on police brutality.

David says:

Re: Perhaps these algorithms should be used in hiring police officers.

Uh, you are aware that both criteria are already used for selecting police officers? If you score too high on IQ tests, you are ineligible.

That minimizes the risk of getting both some pansy unwilling to employ unnecessary force and some whistleblower.

David says:

How this works in practice:

Let’s take some high-profile case, like moving classified and top secret information to a private mail server with dubious security, pretending to do this out of sloppy convenience but in reality in order to circumvent FOIA laws.

Now the prospect is that the perpetrator considers herself mostly above the law, intends to break it again and again, and aims for a power grab where she’ll be able to duck most accountability.

Those are rather dire prospects for rehabilitation, so it seems that a doubling of the sentence is called for and she’ll have to serve two terms rather than a single one. In the land of the free and the home of the brave.

Anonymous Coward says:

Re: How this works in practice:

“perpetrator considers herself mostly above the law”

What if in reality she is? Let’s say the organization investigating and prosecuting has foreseen two possible future outcomes. The first, where they may be held more to account. The second, where they have a long (one or two 4-year terms) ongoing “investigation” powerful enough to invert the power structure of the executive branch of the government, placing them at the top.
