The Criminal Justice System Is Relying On Tech To Do Its Job And That's Just Going To Make Everything Worse

from the laying-the-groundwork-for-human-misery dept

The criminal justice system appears to be outsourcing a great deal of its work. On the law enforcement side, automatic license plate readers, facial recognition tech, and predictive policing have replaced beat cops walking the streets and patrolling the roads. Over on the judicial side, analytic software is helping make sentencing decisions. This is supposed to make the system better by removing bias and freeing up government personnel to handle more difficult duties algorithms can't handle.

As is the case with most things government, it works better in theory than in practice. ALPRs create massive databases of people's movements, accessible by hundreds of law enforcement agencies and subject to almost zero oversight. More is known about facial recognition's failures than its successes, thanks to inherent limitations that churn out false positives at an alarming rate. Predictive policing is the algorithmic generation of self-fulfilling prophecies, building on historical crime data to suggest future crimes will occur in high-crime areas.

While the judicial side might seem more promising, since software could prevent judges from acting on their biases when handing down sentences, it can only offer guidance, which is easily ignored. Worse, the software introduces its own biases, inherited from the data it's fed.

The logic for using such algorithmic tools is that if you can accurately predict criminal behavior, you can allocate resources accordingly, whether for rehabilitation or for prison sentences. In theory, it also reduces any bias influencing the process, because judges are making decisions on the basis of data-driven recommendations and not their gut.

You may have already spotted the problem. Modern-day risk assessment tools are often driven by algorithms trained on historical crime data.

As we’ve covered before, machine-learning algorithms use statistics to find patterns in data. So if you feed it historical crime data, it will pick out the patterns associated with crime. But those patterns are statistical correlations—nowhere near the same as causations. If an algorithm found, for example, that low income was correlated with high recidivism, it would leave you none the wiser about whether low income actually caused crime. But this is precisely what risk assessment tools do: they turn correlative insights into causal scoring mechanisms.
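The mechanism is easy to see in miniature. Here's a deliberately crude sketch of how that works — every record, bracket, and number below is invented, and real risk assessment tools use far more features — showing how a correlation in historical data gets repackaged as an individual's "risk score":

```python
# A toy version of the risk tool described above. The "model" simply
# memorizes base rates from hypothetical historical records, then scores
# individuals by group membership, turning a correlation into a
# causal-looking number.

# Hypothetical historical records: (income_bracket, reoffended)
history = [
    ("low", True), ("low", True), ("low", False),
    ("high", False), ("high", False), ("high", True),
]

def train(records):
    """Compute the recidivism rate per income bracket - pure correlation."""
    totals, reoffenses = {}, {}
    for bracket, reoffended in records:
        totals[bracket] = totals.get(bracket, 0) + 1
        reoffenses[bracket] = reoffenses.get(bracket, 0) + int(reoffended)
    return {b: reoffenses[b] / totals[b] for b in totals}

def risk_score(model, bracket):
    """Score an individual by their group's historical rate,
    not by anything they personally did."""
    return model[bracket]

model = train(history)
print(risk_score(model, "low"))   # higher score, purely by association
print(risk_score(model, "high"))  # lower score, same logic
```

Nothing in this sketch knows whether low income *causes* anything; the defendant is scored by the company their data keeps.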

Correlation is not causation. Past performance is not indicative of future results. And an individual being sentenced is not the average of 20 years of historical crime data. The software may steer judges away from personal biases, but it creates new ones to replace them. It's a lateral "improvement" that does little more than swap the inputs.

Once a system has been brought up to speed on garbage input, the biases multiply and perpetuate themselves. Sentencing decisions based on biased data generate more bad data for the sentencing software… which then leads to successively harsher sentences for the same criminal act with each iteration. As the recursive data rolls in, the sentencing recommendations justify themselves, because who can argue with raw data?
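The loop can be sketched with a toy simulation — every number here is made up. Two districts have the same underlying crime rate; the only difference is biased historical records. Patrols follow the records, and crime is only recorded where patrols go:

```python
# Toy feedback-loop simulation; all figures are hypothetical.
# Districts A and B have IDENTICAL true crime rates. A merely starts
# with three times as many recorded crimes. Patrols are allocated by
# the records, and new records only come from where patrols go.

TRUE_RATE = 0.1        # identical real crime rate in both districts
PATROLS = 100          # patrol-hours allocated each round

recorded = {"A": 30.0, "B": 10.0}   # biased starting data

for round_no in range(5):
    total = recorded["A"] + recorded["B"]
    shares = {d: recorded[d] / total for d in recorded}  # the "prediction"
    for district, share in shares.items():
        # More patrols -> more recorded crime, regardless of the true rate.
        recorded[district] += PATROLS * share * TRUE_RATE

ratio = recorded["A"] / recorded["B"]
print(ratio)  # ~3.0: the disparity never shrinks
```

Under these assumptions the gap never corrects, even though the districts are identical: the model's predictions decide where the confirming data gets collected.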

This is not to say tech should not be used by the criminal justice system. It's that it needs to be subject to rigorous oversight, and those deploying it need to be made aware of its limitations, including its innate tendency to reinforce biases rather than remove them. This isn't something to be taken lightly. The lives and liberties of Americans are literally at stake. Taking a hands-off approach to tech deployment is highly irresponsible, and it indicates those in power care very little about what happens to the people they serve.

Filed Under: algorithms, criminal justice, technology


Reader Comments



  • Stephen T. Stone (profile), 28 Jan 2019 @ 8:43pm

    Bold of you to assume those in power think they serve the people instead of the other way around.


  • Pixelation, 28 Jan 2019 @ 8:53pm

    No morality

    "The Criminal Justice System Is Relying On Tech To Do Its Job And That's Just Going To Make Everything Cheaper"

    FTFY. Who cares if people's lives are screwed over in the process.


    • JoeCool (profile), 29 Jan 2019 @ 5:46am

      Re: No morality

      Also remember that being cheaper DOESN'T mean it will cost less; it means the primary actors will pocket more of the taxpayers' money.


  • Atkray (profile), 28 Jan 2019 @ 9:29pm

    Think of the judges

    It is easy to condemn potential flaws in this, but being a judge is not as simple as most people believe.

    I mean, having a guaranteed job where you get to make your own rules and cannot be fired sounds great, but it is a very taxing position.

    Sure they have people to read everything for them and people to write up their opinions for them.

    But actually forming an opinion that doesn't reveal their inner weakness is bloody hard.

    Won't someone please think of the judges?


  • Anonymous Coward, 28 Jan 2019 @ 9:30pm

    This message brought to you by Dunkin....


  • Anonymous Coward, 29 Jan 2019 @ 12:32am

    Prosecutorial BIAS IS the meat and potatoes for these kinds. There will Never Not be BIAS.


  • Anonymous Coward, 29 Jan 2019 @ 3:01am

    I'm reminded of...

    ...the recent Reply All two-parter on the CompStat system and its history: from its inception, to when cops began relying on it and actively being directed by it not to do their jobs properly, as the horrendous flaws became apparent.

    https://www.gimletmedia.com/reply-all/127-the-crime-machine-part-i#episode-player

    https://www.gimletmedia.com/reply-all/128-the-crime-machine-part-ii#episode-player


  • David, 29 Jan 2019 @ 3:21am

    Technology is great

    You train the computers with statistics about prejudice and its societal consequences, and they will give you what you want to hear with science on top. It's far more authoritative if a computer shouts "only a dead nigger is a good nigger" and "hang them all and you can't go wrong".

    Predictive sentencing could pretend to make sense if jail time could be shown to prevent recidivism and jail could be considered a way to reform and reintegrate criminals: then it would be prevention. But punishing people in anticipation of what they might do makes no sense. Particularly not if jails are the entry card into more crime.


    • Scary Devil Monastery (profile), 31 Jan 2019 @ 2:52am

      Re: Technology is great

      "You train the computers with statistics about prejudice and its societal consequences, and they will give you what you want to hear with science on top."

      I'll see you and raise. The very second the algorithms and code used leak, what we suddenly have is a paradigm where organized criminals will know, to a T, where law enforcement is unlikely to be stationed and ready.

      I can, all too easily, envision a new type of criminal who specializes in selling this type of demographic map detailing where the boys in blue are least likely to be present, to any smart burglar with a bitcoin wallet.


  • Glenn, 29 Jan 2019 @ 4:48am

    The ultimate "human error" ...taking the human out of the equation.


  • Christenson, 29 Jan 2019 @ 7:24am

    Yet another example of moderation at scale...

    Remember Masnick's rule:

    Computers are *bad* at context.... so how is a computer gonna ask itself "is that right"???...when lives are on the line???...and the past history is full of wrongs!

    And never mind the highly relevant question of who serves whom!


  • Rekrul, 29 Jan 2019 @ 8:44am

    I think this is a great idea! In fact I think they should do away with grand juries in cases where a cop is accused of a crime and let an impartial computer program decide whether to serve an indictment based on the generic details entered into a form.

    Shooter: Police Officer
    Victim: Unarmed
    Result: Death

    Computer says: Indict!


  • Thad (profile), 29 Jan 2019 @ 8:53am

    Good piece on Ars: Yes, “algorithms” can be biased. Here’s why

    This, of course, shouldn't be any surprise to anyone who knows even a little bit about how computers work. Computers aren't magic; you can't give them biased data and get an unbiased result. Garbage in, garbage out.


    • Scary Devil Monastery (profile), 31 Jan 2019 @ 2:56am

      Re:

      "This, of course, shouldn't be any surprise to anyone who knows even a little bit about how computers work. "

      On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

        • Charles Babbage, inventor of the first mechanical calculator

      Unfortunately politicians have a very long tradition of not understanding cause and consequence.


  • Mason Wheeler (profile), 29 Jan 2019 @ 10:55am

    The other difference between algorithms and people is that algorithms are much better at being corrected.

    Algorithms can learn something once and remember it throughout their "lifetime."

    Algorithms don't die and lose all their accumulated knowledge without passing it on.

    Algorithms don't have an ego that prevents them from accepting new and better information.

    Algorithms are still new to the job, so it's rather unfair to compare them to humans with decades of experience. They'll get better over time, and they'll get better faster than humans do.


    • Valkor, 29 Jan 2019 @ 3:40pm

      Re:

      Algorithms are a tool, not a solution, and need to be used with caution and respect, kind of like a power tool.

      An easy example is Body Mass Index. It is useful for statistical analysis of populations, but virtually worthless as a serious measure of individual health. The allure of it, like that of a sentencing algorithm, is that it's easy.


