LAPD's Failed Predictive Policing Program The Latest COVID-19 Victim

from the [hastily-signs-DNR-certificate] dept

Fucking predictive policing/how the fuck does it work. Mostly, it doesn't. For the most part, predictive policing relies on garbage data generated by garbage cops, turning years of biased policing into "actionable intel" by laundering it through a bunch of proprietary algorithms.

More than half a decade ago, early-ish adopters were expressing skepticism about the tech's ability to suss out the next crime wave. For millions of dollars less, average cops could have pointed out hot crime spots on a map based on where they'd made arrests, while still coming nowhere close to the reasonable suspicion needed to declare nearly everyone in a high crime area a criminal suspect.

The Los Angeles Police Department's history with the tech seems to indicate it should have dumped it years ago. The department has been using some form of the tech since 2007, but all it seems to be able to do is waste limited law enforcement resources while violating the rights of Los Angeles residents. The only explanations for the LAPD's continued use of this failed experiment are the sunk cost fallacy and its occasional use as a scapegoat for the department's biased policing.

Predictive policing is finally dead in Los Angeles. Activists didn't kill it. Neither did the LAPD's oversight. Logic did not finally prevail. For lack of a better phrase, it took an act of God {please see paragraph 97(b).2 for coverage limits} to kill a program that has produced little more than community distrust and civil rights lawsuits. Caroline Haskins has more details at BuzzFeed.

An LAPD memo dated April 15 quoted Police Chief Michel R. Moore saying that the police department would stop using the software, effective immediately, not because of concerns that activists have raised but because of financial constraints due to COVID-19, the disease caused by the novel coronavirus.

"The city's financial crisis, coupled with the impact of the COVID-19 pandemic, has resulted in the immediate freeze of new contractual agreements and 'belt-tightening' instructions by the Mayor to all city departments for all further expenditures," the memo said. "Therefore, the Department will immediately discontinue the use of PredPol and its associated reports."

Activists like Hamid Khan of the Stop LAPD Spying Coalition are calling this a win. And it is, sort of. When something you want stopped stops, it's still a victory, even if it appears to be due to unforeseeable developments rather than local activism. This doesn't mean Khan and others shouldn't keep working to keep the LAPD's PredPol system shut down. But it's perhaps too optimistic to declare this turn of events a testament to their activism when it appears the LAPD is only temporarily mothballing a program its budget can't support at the moment.

But we can still hold out hope it won't be resurrected when the current crisis passes. The LAPD has struggled to show the program actually impacts criminal activity more than it does Constitutional rights, despite having more than a decade to do so. We can still celebrate its death, even if it's only being buried in effigy at this point. Maybe by the time this has all passed, the LAPD will realize it hasn't missed the expensive software's dubious contribution to the city's safety and abandon it for good.

Filed Under: algorithms, covid-19, lapd, law enforcement, police, predictive policing


Reader Comments



  • icon
    Koby (profile), 28 Apr 2020 @ 1:52pm

    Praise Objectivity

    One of the common complaints is that police exhibit a bias, so if police work can be directed without that bias, then I would have to praise that kind of goal. I think the attempt to feed information into a computer system that could objectively make decisions is an admirable one. Unfortunately, the concept of pre-crime analysis is beyond current technological capabilities. The coronavirus is going to be used as a cheap excuse to sweep this one under the rug. Back away slowly, and hope no one will notice.

    reply to this | link to this | view in chronology ]

    • identicon
      Anonymous Coward, 29 Apr 2020 @ 12:32am

      Re: Praise Objectivity

      There is an ancient phrase (by computer industry standards, at least): "Garbage In, Garbage Out." Because police are the people who decide what goes into the predatory policing database, their bias goes in with it. One of the most insidious things about predatory policing is that simply interacting with police increases your chances of the system recommending more interactions. Each new interaction piles up in the database, telling the system you are a person of ongoing interest to police who should therefore be subjected to ever-increasing scrutiny.

      And it can all start with just a single instance of biased policing.
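A toy simulation makes that feedback loop concrete. This is an illustrative sketch only, not PredPol's actual model: two areas are given identical true crime rates, patrols follow past records, and a single biased record in Area A compounds while Area B never enters the database at all.

```python
import random

random.seed(42)

# Hypothetical setup: both areas have the SAME underlying crime rate,
# but Area A starts with one extra record from a biased interaction.
TRUE_CRIME_RATE = 0.1
recorded = {"A": 1, "B": 0}

for day in range(1000):
    total = recorded["A"] + recorded["B"]
    for area in ("A", "B"):
        # The "predictive" system sends patrols where records are highest,
        # so crimes are only observed (and logged) where police are looking.
        patrol_share = recorded[area] / total if total else 0.5
        if random.random() < TRUE_CRIME_RATE * patrol_share:
            recorded[area] += 1

print(recorded)
```

Despite identical true crime rates, Area B never accumulates a single record because patrols never go there, while Area A's one biased entry snowballs into a "high-crime" designation. That is the single instance of biased policing doing all the work.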

      reply to this | link to this | view in chronology ]

    • icon
      PaulT (profile), 29 Apr 2020 @ 4:23am

      Re: Praise Objectivity

      "I think the attempt to feed information into a computer system that could objectively make decisions is an admirable one."

      But it's one that will never work, because whoever feeds the data into the computer, and those who program it, all have their own biases, sometimes unconscious ones.

      The best analogy is the issues surrounding facial and voice recognition systems. There have been numerous complaints about both and how they treat minorities, either returning false positives in law enforcement use cases or having higher error rates in personal systems. This is more likely an implementation error than deliberate prejudice, but it exists because the people designing and testing these systems are more likely to be white males and less likely to belong to the affected minority.

      So, it's quite likely that any such "pre-crime" system will actually reinforce systemic prejudice rather than prevent it. Combined with the tendency for people to believe whatever a computer tells them over the evidence of their own eyes (see: idiots driving into rivers because the GPS told them there was a bridge there even though they can see there's not one there), it's a recipe for disaster without oversight - and the overseers are going to be the ones with the bias problem you're trying to solve.

      reply to this | link to this | view in chronology ]

  • identicon
    Anonymous Coward, 28 Apr 2020 @ 1:59pm

    garbage in == garbage out

    reply to this | link to this | view in chronology ]

  • icon
    Norahc (profile), 28 Apr 2020 @ 2:54pm

    Q: Want to really use predictive policing to determine where crimes will occur?

    A: Just program in the officer's patrol route for the shift.

    reply to this | link to this | view in chronology ]

  • identicon
    Bobvious, 28 Apr 2020 @ 4:46pm

    Too much Minority Report perhaps?

    reply to this | link to this | view in chronology ]

  • identicon
    Pixelation, 28 Apr 2020 @ 5:09pm

    They should have named the software, "Probable Cause".

    reply to this | link to this | view in chronology ]

  • identicon
    Anonymous Coward, 28 Apr 2020 @ 8:44pm

    Cut the Police!

    I guess we have the answer for how to get our rights back. Cut police funding until they stop.

    reply to this | link to this | view in chronology ]

  • icon
    gbwa (profile), 29 Apr 2020 @ 1:53am

    LAPD

    This is sort of a catch-22. The LAPD would be wasting its resources if it didn't focus on areas where crime is more likely to occur - a strategy that patrolled all neighbourhoods equally (with the same coverage currently applied to high-crime areas) would simply be unaffordable. Same goes for repeat offenders. The data on recidivism rates is pretty disheartening, but the stats are clear: far and away, the people most likely to commit crimes are those who have committed them in the past. A population-wide policing strategy should then, naturally, focus on areas with high crime rates and on repeat offenders to maximize return per policing dollar. The issue, of course, is that a disproportionate number of these areas are predominantly Black or Hispanic. This reflects a deeper, pretty messed-up socioeconomic issue not unique to LA but certainly highly visible there.

    reply to this | link to this | view in chronology ]

  • identicon
    Anonymous Coward, 29 Apr 2020 @ 6:13am

    What many police officers want is an objective, unbiased system that always agrees with them. Achieving one out of three is not bad, but it is hard to justify with the current economy.

    reply to this | link to this | view in chronology ]

  • identicon
    me, 29 Apr 2020 @ 5:52pm

    Monkey See....

    "Maybe by the time this has all passed, the LAPD will realize it hasn't missed the expensive software's dubious contribution to the city's safety and abandon it for good."

    And we can also hope that the LAPD's coming experience without this albatross around its neck may help other PDs around the globe decide to do likewise and end their own similarly wasted expenses.

    Hey. We can hope, right?

    reply to this | link to this | view in chronology ]

  • identicon
    Anonymous Coward, 1 May 2020 @ 12:19pm

    It's Weird ...

    In Area A, where many criminals live, work, and victimize people, actual police officers interact with actual criminals. The police officers in this area go from call to call to call. Police are extremely busy in these areas; this data goes into a computer system.

    In Area B, where mostly law-abiding people live and work, police officers don't interact much with actual criminals. They spend most of their time on traffic violations and responding to minor calls like noise complaints. Police aren't very busy in these areas; this data goes into the computer system.

    Why does anyone think it's a big mystery that the computer system tells police to spend time and resources in Area A? Why is this confusing? More to the point, why is it controversial?

    Obviously the computer system is a waste of taxpayer funds … because everyone in that city knows where the Area As and Area Bs are. The cops don't need an expensive computer system to tell them where crimes are happening and going to happen.

    Now, if Cushing, Khan, and most Techdirt commenters happen to notice a racial difference between the people in Area As and Area Bs, that's their problem, not the police department's. Reality is frequently uncomfortable.

    reply to this | link to this | view in chronology ]

  • identicon
    Anonymous Coward, 2 May 2020 @ 5:08pm

    hi

    reply to this | link to this | view in chronology ]

