Proprietary Algorithms Are Being Used To Enhance Criminal Sentences And Prevent Defendants From Challenging Them

from the like-the-Coke-formula,-but-for-years-of-someone's-life dept

When law enforcement agencies want to know what people are up to, they no longer have to send officers out to walk a beat. It can all be done in-house, using as many data points as can be collected without a warrant. Multiple companies offer "pre-crime" databases for determining criminal activity "hot spots," which allow officers to reach foregone conclusions based on what someone might do rather than what they've actually done.

Not that it's doing much good. For all the time, money, and effort poured into them, these databases seem to be of little utility.

Many law enforcement agencies use software to predict potential crime hot spots, and the police in Kansas City, Mo., and other places have used data to identify potential criminals and to try to intervene.

[...]

In Chicago, where there has been a sharp rise in violent crime this year, the police have used an algorithm to compile a list of people most likely to shoot or be shot. Over Memorial Day weekend, when 64 people were shot in Chicago, the police said 50 of the victims were on that list.

So much for "intervention." Having a list of people who have a higher risk of being shot doesn't mean much when all it's used for is confirming the database's hunches. However, these same databases are being put to use in a much more functional way: determining sentence lengths for the criminals who have been arrested.

When Eric L. Loomis was sentenced for eluding the police in La Crosse, Wis., the judge told him he presented a “high risk” to the community and handed down a six-year prison term.

The judge said he had arrived at his sentencing decision in part because of Mr. Loomis’s rating on the Compas assessment, a secret algorithm used in the Wisconsin justice system to calculate the likelihood that someone will commit another crime.

We're locking up more people for more years based on criminal activity they'll no longer have the option of possibly performing. This is nothing new. Sentencing enhancement is based on a lot of factors, not all of them confined to proprietary databases. But what is new are the algorithms used to determine these sentence enhancements, most of which belong to private companies that are completely uninterested in sharing this crucial part of the equation with the public.

In Mr. Loomis's case, the software determined he would be likely to engage in further criminal activity in the future. A so-called "Compas score" -- provided by Northpointe Inc. -- resulted in a six-year sentence for eluding an officer and operating a vehicle without the owner's consent. His lawyer is challenging this sentence enhancement and going after Northpointe, which refuses to release any information about how the Compas score is compiled.

What Northpointe has released are statements that confirm the code is proprietary and that the Compas score is "backed by research" -- although it is similarly unwilling to release this research.

The problem here isn't so much the use of algorithms to determine sentence lengths. After all, state and federal sentencing guidelines are used all the time, and they include factors such as the likelihood of future criminal activity. But those guidelines can be viewed by the public and are much more easily challenged in court.

The use of private contractors to provide input on sentencing renders the process opaque. Defendants can't adequately challenge sentence enhancements without knowing the details of the "score" being presented by prosecutors to judges. The algorithms' inner workings should either be made available to defendants upon request, or the "score" should be determined solely by government agencies, where the data and determining factors can be inspected by the public.
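To make the transparency point concrete, here is a minimal sketch of what a publicly inspectable score could look like. Everything in it is hypothetical -- the factor names, the weights, the point scale -- and it illustrates auditability in general, not how Compas or any real instrument actually works:

```python
# A *transparent* risk score: factors and weights are published, so a
# defendant can recompute -- and contest -- every line of the arithmetic.
# All factor names and weights below are invented for illustration.

WEIGHTS = {
    "prior_convictions": 2.0,   # points per prior conviction
    "age_under_25": 3.0,        # flat bump for younger defendants
    "failed_appearances": 1.5,  # points per missed court date
}

def risk_score(record):
    """Weighted sum over published factors; auditable line by line."""
    return sum(weight * record.get(factor, 0)
               for factor, weight in WEIGHTS.items())

defendant = {"prior_convictions": 2, "age_under_25": 1}
print(risk_score(defendant))  # 2 * 2.0 + 1 * 3.0 + 0 * 1.5 = 7.0
```

A closed scoring system produces the same kind of number, but with the weights table (and whatever replaces it) hidden behind a trade-secret claim -- which is exactly what makes the score impossible to cross-examine.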

We're now in the unfortunate situation where companies are telling judges how long someone should be locked up -- using data which itself might be highly questionable. The feeling seems to be that if enough data is gathered, good things will happen. But as we can see from Chicago's implementation of this technology, the only thing it's done so far is attach confirmation-bias toe tags to the ever-increasing number of bodies in the city's morgues.

The use of locked-down, proprietary code in sentencing is more of the same. It undermines the government's assertion that prison sentences are a form of rehabilitation and replaces it with the promise that criminal defendants will "do the time" so they can't "do the crime" -- all the while preventing those affected from challenging this determination.


Reader Comments



  • Anonymous Anonymous Coward (profile), 24 Jun 2016 @ 6:09pm

    Challenge-ability

    The government should never be allowed to use something that is unable to be challenged. Ever. For anything. We are the government, they are only elected or selected to represent us. We should be able to challenge each and every thing. Whenever and however we want. Any prohibition on such challenges is what should be prevented.


  • That Anonymous Coward (profile), 24 Jun 2016 @ 7:54pm

    We are a nation that has accepted secret courts & secret laws, why should we blink at secret scores?
    It'll NEVER happen to you or anyone you love because you are "Good People"(tm) and bad people deserve it.

    While not directly related, there was a report by someone looking into one of the hot-spot scoring tools cops were using: while he himself wasn't a risk, there was some criminal history at the location where he lived, so that raised his risk assessment even though it happened long before he resided there.

    Our Magic Tiger-Repelling Rock really works, but we can't explain it... even if allowing its use undermines the basic tenets of justice.


  • Anonymous Coward, 24 Jun 2016 @ 8:20pm

    Mmmmm, Probability!

    "...software to predict potential crime hot spots..."

    Oooh! I started employing the sort of correlational, statistical models (ARIMA) that are useful for this kind of stuff about thirty years ago. Howzabout we apply it to predicting the municipal and other jurisdictions most likely to illegally abuse the Constitutional rights of citizens, up to and including murder?! Cool! Instant new business model. Oh wait, all of them, never mind.


    • Anonymous Coward, 24 Jun 2016 @ 8:28pm

      Re: Mmmmm, Probability!

      "Oh wait, all of them, never mind."

      Also, (almost?) zero paying customers sinks this as a business model.


  • Shane C (profile), 24 Jun 2016 @ 9:23pm

    Follow the money...

    Gee I wonder if any of the For-Profit prisons have a financial interest in this line of computational modeling?


    • JustShutUpAndObey, 25 Jun 2016 @ 8:03am

      Re: Follow the money...

      Now, let's not be a cynic.
      I mean, a realist.


    • Anonymous Coward, 25 Jun 2016 @ 8:34am

      Re: Follow the money...

      You got that right, which is why I purchased stock in the two for-profit prison companies a while ago. I do not foresee a drop in prison population; I foresee it expanding, especially with the DOJ’s interest in searching all cell phones. The war on pot is going to become the war on copyright infringement; got to keep the prisons populated, otherwise you will have too many unemployed guards.


      • Anonymous Coward, 26 Jun 2016 @ 7:49am

        Re: Re: Follow the money...

        Copyright on pot?
        Patent maybe ... certainly trademark disputes, like for Panama Red, Acapulco Gold, Road Apple Red, etc


  • Anonymous Coward, 24 Jun 2016 @ 9:44pm

    Capitalism

    "companies are telling judges how long someone should be locked up"

    USA! USA! USA! Capitalism eff yeah! Solves all problems even those annoying thinky thingys. Am I right? *high five*


    • Anonymous Coward, 27 Jun 2016 @ 4:11am

      Re: Capitalism

      Has nothing to do with capitalism. It has everything to do with the government getting in bed with capitalists which it should not be doing. Socialism and eventually communism would be far worse because it always, and I mean always, leads to a very large, powerful authoritarian government. And guess what happens to dissenters? The stack of dead bodies that these systems have left behind in the last 100 years can be seen from the moon.


  • anonymouse, 24 Jun 2016 @ 9:56pm

    Not proprietary

    Take a learning algorithm like TensorFlow. Feed it career-criminal information with all the patterns of behavior, location, associations, etc... and out pop similar paths of behavior and how often they've been seen. They won't be able to explain how the algorithm works with specifics for any one person.
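A toy version of what this comment describes, sketched here purely for illustration: a hand-rolled logistic model trained on synthetic "offender" records. The feature names and the hidden labeling rule are entirely made up; the point is that the trained weights are just numbers, with no per-person explanation attached.

```python
import math
import random

random.seed(0)

# Synthetic records: [prior_arrests 0-10, known_associates 0-5, years_unemployed 0-5],
# labeled by a hidden rule the model has to rediscover. All of this is invented.
records = []
for _ in range(200):
    x = [random.randint(0, 10), random.randint(0, 5), random.randint(0, 5)]
    y = 1 if x[0] + x[1] > 7 else 0  # the "ground truth" nobody gets to see
    records.append((x, y))

def sigmoid(z):
    # Numerically safe logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

# Plain stochastic gradient descent on log loss.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.01
for _ in range(300):
    for x, y in records:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        g = p - y
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

def score(x):
    """The 'risk score': a probability with no narrative behind it."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# The model separates high from low risk, but the learned weights are just
# three floats -- they say nothing specific about any individual defendant.
print([round(wi, 2) for wi in w], round(score([10, 5, 0]), 3), round(score([0, 0, 0]), 3))
```

Even this three-weight toy can't be "explained" for one person beyond pointing at the arithmetic; a deep model of the TensorFlow variety is far more opaque, which is the commenter's point.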


  • David, 24 Jun 2016 @ 10:23pm

    I don't mind how the algorithms work

    But this is rubbish. You cannot serve extra time for crimes you did not commit and which you are not even suspected of having committed. What's next? Death penalty for high crime areas? Round up and execute everybody living there? Because statistics suggest that to be a good idea?


  • Anonymous Coward, 25 Jun 2016 @ 1:46am

    Sounds like we are racing toward Minority Report as fast as we can manage. Only Hollydud movies are not 'how-tos', they are entertainment. As much as the politicians and cops would like to believe in it, it ain't real.


  • Anonymous Coward, 25 Jun 2016 @ 1:59am

    Self-fulfilling prophecy

    What do you bet the algorithm points police to a specific neighborhood? When people are suddenly picked up and tossed in prison, citizens start protesting, and the police get defensive because blue lives matter more. They crack down further until every neighborhood is a hot spot for the police department's own response.


  • frank87 (profile), 25 Jun 2016 @ 2:34am

    Isn't public court a civil right?

    And I think this article shows why.


  • David, 25 Jun 2016 @ 2:42am

    Police are going to love this.

    Now the courts are adopting the kind of "he was one of those no-do-good niggers. If he wasn't actually guilty for this crime, I bet he was for some others. I say, hang them all and let God sort them out" logic underlying some formerly debatable forms of policing.

    Whatever happened to "do the crime, serve the time"? Why should criminals from a background not requiring breaking the law semi-regularly to survive get softer sentences?


    • Shilling, 25 Jun 2016 @ 11:45am

      Re: Police are going to love this.

      The NYTimes suggests it is common for these algorithms to exclude race, which in itself is preposterous, as the more data you include, the more accurate the algorithm would be. Too bad you generalise everyone with these algorithms and the outliers will be punished even more. I thought the three-strikes system (used in Wisconsin) would cover all this nonsense anyway, but I guess money has to be made.


      • Anonymous Coward, 26 Jun 2016 @ 7:53am

        Re: Re: Police are going to love this.

        Three strikes: a bad idea, just like mandatory minimum sentencing.


        • David, 27 Jun 2016 @ 12:27am

          Re: Re: Re: Police are going to love this.

          Since "sentence enhancing" is based on expected rather than past behavior, this is more like a "minus three strikes system".

          Minus three strikes, and you're in. Serve the time before the crime.


      • nasch (profile), 27 Jun 2016 @ 9:44am

        Re: Re: Police are going to love this.

        The nytimes suggests it is common that these algorithms exclude race which in itself is preposterous as the more data you include the more accurate this algorithm would be.

        Considering how segregated most cities are, it probably doesn't matter much.


  • Anonymous Coward, 25 Jun 2016 @ 2:55am

    Compas doesn't even HAVE a database, the company just says "defendant isn't white, all those [N-words] are criminals, maximum sentence".

    Northpointe is HEAVILY invested in private prisons and wants as many people as possible to stay as long as possible, since they work them as unpaid slave labor.


    • David, 26 Jun 2016 @ 12:27am

      Re:

      Compas doesn't even HAVE a database, the company just says "defendant isn't white, all those [N-words] are criminals, maximum sentence".

      That would be unconscionable. It's more like "given current conviction rates and terms, we are likely to fall below 90% prison occupancy at the time his usual prison term would end, so we'd better extend it. Guilty, guilty, guilty, guilty! Do we need more solitary confinement bookings right now?" You can't just make numbers up.


  • Anonymous Coward, 25 Jun 2016 @ 2:57am

    Northpointe therefore CANNOT show its calculations or software because there is none.

    They literally have people physically typing stuff in, no background process or database software at all.

    Total scam to increase population count at prisons they manage/own.


  • Daydream, 25 Jun 2016 @ 3:53am

    Wait, if they can identify factors that lead to a high risk of reoffending...

    Let's assume this is true.
    Let's assume these companies have developed genuine algorithms that can assess all of the factors affecting a person and determine whether they will be a criminal or continue to be a criminal.
    Let's assume that these algorithms, as a whole, tend towards high degrees of sensitivity and specificity. That they're good tests.

    If these algorithms know what combination of factors will make a person commit crimes, then they know what factors can be changed so that person will no longer commit crimes.
    To put it another way, the profiles they construct around convicts to determine if they're a risk to the community, can also be used in their rehabilitation.

    *checks the article again*

    Nope, I don't see any mention of these algorithms being used to recommend education and training, psychological counselling, employment assistance, community service, or any other form of rehabilitation that might ACTUALLY reduce the recidivism rate.
    Just...longer sentences. It's like they WANT to keep people in jail and use them for cheap labour for the rest of their lives.


    • David, 25 Jun 2016 @ 4:07am

      Re: Wait, if they can identify factors that lead to a high risk of reoffending...

      Why would you want to reduce recidivism? It would be highly unfair to jail people longer for their recidivism potential and then work on their rehabilitation.

      They served the time for their chance at recidivism, now give them an honest and unbiased opportunity for earning the prison term they have already served in advance.


    • Richard (profile), 25 Jun 2016 @ 2:13pm

      Re: Wait, if they can identify factors that lead to a high risk of reoffending...

      I don't see any mention of these algorithms being used to recommend education and training, psychological counselling, employment assistance, community service, or any other form of rehabilitation that might ACTUALLY reduce the recidivism rate.

      When for-profit prisons were first mooted in the UK, the suggestion was made that they should be paid by results - i.e. some of the fees would be held back until the prisoner had been released and had not re-offended for a set period.

      That would mean that the market mechanisms would be working in our favour, and the kind of algorithms you describe would be worth investing in.

      Crime, and the prison population, would fall.


  • Grockman (profile), 25 Jun 2016 @ 4:46am

    Welcome to the new NK!


  • Anonymous Coward, 25 Jun 2016 @ 6:30am

    Private Prison Industry:
    Wow - these algorithms can accurately predict criminal activity, awesome - we're gonna be soooo rich!
    ... oh wait, but not OUR criminal activity - right??????

    Right? ... hello?


  • Anonymous Coward, 25 Jun 2016 @ 6:32am

    I found the list they're using

    It's the Chicago Phonebook.



  • Anonymous Coward, 25 Jun 2016 @ 7:54am

    life imitating fiction... in the worst way

    Minority Report was supposed to be an example of what 'NOT' to do. People being judged on what they might or might not do, before they could even think of putting thought to action, seems like a loss of free will.
    Any other intervention that doesn't result in incarceration or death would be a welcome prospect.


  • Pixelation, 25 Jun 2016 @ 7:55am

    Wondering

    Would a judge normally take testimony in sentencing? Does a defendant have the right to cross examine the person giving that testimony? What if that person refuses to give answers?


    • David, 26 Jun 2016 @ 12:31am

      Re: Wondering

      "I am totally convinced that this person will commit something if left on the street. Do you want it on your conscience, dear jury?"
      I mean, the average citizen (and thus jury member) is queasily ok with Guantanamo. Same idea here.


  • Whatever (profile), 25 Jun 2016 @ 8:30am

    So the cops are using technology to improve things...

    and Techdirt is upset?

    If the police are able to predict crime hot spots, perhaps they can work to improve policing in the area, or work to change the behaviors of the locals such that crime drops. That would be true crime prevention.

    The deadpool style list is a little bit creepier, and a whole lot harder to turn into action. They can't add protection for every person who might be at risk. But knowing who might be killed might give them a better insight into what is going on in the darker circles of life.

    I guess you guys would prefer that the cops spend their time in school studying to be top-flight civil rights lawyers. Seems to be a job requirement these days!


    • Anonymous Coward, 28 Jun 2016 @ 11:30am

      Re: So the cops are using technology to improve things...

      So the cops are using technology to improve things...

      Then why not release the algorithm?

      What's the big fucking secret?


    • Wendy Cockcroft, 29 Jun 2016 @ 5:55am

      Re: So the cops are using technology to improve things...

      Shouldn't they be working on a better understanding of the law instead of pleading ignorance when a defence lawyer pulls their casework apart in court?

      Due process is not an impediment to justice.


  • Anonymous Coward, 25 Jun 2016 @ 9:03am

    If you can't do the math in the algorithm, you shouldn't be using it

    in any fashion that could deprive someone of life, liberty, or property. Otherwise what you are looking at is a "cloud based" justice system, where any of the underlying systems can be surveilled or tweaked by the hosting party.

    It is not reasonable to assume that the robed golems have the faintest clue how even one of the underlying subsystems work.

    In my experience the aggregation of that much data almost always reveals practical solutions during the development cycle. If you understood the code, you'd be looking at managed approaches to social justice, not punitive approaches.

    Selling fear and death is easy. Teaching somebody how to observe entropy through code is a wholly different problem indeed. Those doing the former outnumber those able to do the latter by at least a thousand to one.


  • David, 25 Jun 2016 @ 9:18am

    By the way...

    Anybody else getting really jarred whenever reading that "Enhance" in the article title?

    Must be one of the most evil pieces of Newspeak I ever saw.


  • Anonymous Coward, 25 Jun 2016 @ 1:01pm

    So what this tells me is Person of Interest went off the air because it's getting too close to reality now.


  • Anonymous Coward, 25 Jun 2016 @ 2:14pm

    Does the correlation go up as more corrupt cops commit crimes and citizens, losing patience with a failing system, start taking the law into their own hands?


  • Anonymous Coward, 25 Jun 2016 @ 4:43pm

    I thought due process was a right. How can people be punished without the ability to challenge the prosecution and their methods? This case and the FBI's NIT both throw that right, right out the window and at least in the FBI cases the evidence secured was dumped for not showing how the NIT works.

    Did the defendant just have a court assigned attorney, that was already overworked?


  • Mark Wing, 25 Jun 2016 @ 6:46pm

    Computer algorithms have been making value judgments about us as individuals for the last couple decades.


  • Daydream, 26 Jun 2016 @ 12:44am

    Perhaps these algorithms should be used in hiring police officers.

    If these algorithms can so reliably determine which offenders are 'high risk' and therefore should be kept in prison longer, surely they can be used to determine which applicants to the police force are at 'high risk' of committing police brutality.

    Or, y'know, determine who's likely to blow the whistle on police brutality.


    • David, 26 Jun 2016 @ 1:24am

      Re: Perhaps these algorithms should be used in hiring police officers.

      Uh, you are aware that both criteria are already used for selecting police officers? If you score too high on IQ tests, you are ineligible.

      That minimizes both the risk to get some pansy unwilling to employ unnecessary force as well as some whistleblower.


  • Mark Wing, 26 Jun 2016 @ 3:00am

    If Color <> "White" Then
        Throw Book
    Else
        If Income = Low Then
            Throw Book
        Else
            Set Free
        End If
    End If


  • David, 26 Jun 2016 @ 3:44am

    How this works in practice:

    Let's take some high-profile case, like moving classified and top secret information to a private mail server with dubious security, pretending to do this out of sloppy convenience but in reality in order to circumvent FOIA laws.

    Now the prospect is that the perpetrator considers herself mostly above the law, intends breaking it again and again, and aims for a power grab where she'll be able to duck most accountability.

    Those are rather dire prospects for rehabilitation, so it seems that a doubling of the sentencing is called for and she'll have to serve two terms rather than a single one. In the land of the free and the home of the brave.


    • Anonymous Coward, 26 Jun 2016 @ 4:43am

      Re: How this works in practice:

      "perpetrator considers herself mostly above the law"

      What if in reality she is? Let's say the organization investigating and prosecuting has foreseen two possible future outcomes: the first, where they may be held more to account; the second, where they have a long (one or two 4-year terms) ongoing "investigation" powerful enough to invert the power structure of the executive branch of the government, placing them at the top.


  • nygrump, 26 Jun 2016 @ 7:08am

    full gestapo

    secret no fly lists
    secret court computer programs
    secret no gun lists

    Just wait until the cashless society is enforced: NO MONEY FOR YOU!
    Generation Pod won't even blink.


  • Anonymous Coward, 26 Jun 2016 @ 8:43pm

    Of those people on the list who were shot - how many of them were shot because they're on the list?

    That's really scary to think about.


  • Anonymous Coward, 27 Jun 2016 @ 4:08am

    Does Northpointe own prisons?

    I wonder if these companies invest in prisons? A question a defense lawyer should be asking.


  • SouthHill, 27 Jun 2016 @ 5:17pm

    Let's see. If you get an extra 5 years on your sentence because the algorithm says you're "high risk" to commit, say, a robbery, then you serve your time and get out and commit a robbery, do you get credit for the "pre-crime" sentence that you've already served?


