LAPD's Failed Predictive Policing Program The Latest COVID-19 Victim

from the [hastily-signs-DNR-certificate] dept

Fucking predictive policing/how the fuck does it work. Mostly, it doesn’t. For the most part, predictive policing relies on garbage data generated by garbage cops, turning years of biased policing into “actionable intel” by laundering it through a bunch of proprietary algorithms.

More than half a decade ago, early-ish adopters were expressing skepticism about the tech’s ability to suss out the next crime wave. For millions of dollars less, average cops could have pointed out hot crime spots on a map based on where they’d made arrests, while coming nowhere close to the reasonable suspicion needed to declare nearly everyone in a high-crime area a criminal suspect.

The Los Angeles Police Department’s history with the tech seems to indicate it should have dumped it years ago. The department has been using some form of the tech since 2007, but all it seems to be able to do is waste limited law enforcement resources to violate the rights of Los Angeles residents. The only explanations for the LAPD’s continued use of this failed experiment are the sunk cost fallacy and its occasional use as a scapegoat for the department’s biased policing.

Predictive policing is finally dead in Los Angeles. Activists didn’t kill it. Neither did the LAPD’s oversight. Logic did not finally prevail. For lack of a better phrase, it took an act of God {please see paragraph 97(b).2 for coverage limits} to kill a program that has produced little more than community distrust and civil rights lawsuits. Caroline Haskins has more details at BuzzFeed.

An LAPD memo dated April 15 quoted Police Chief Michel R. Moore saying that the police department would stop using the software, effective immediately, not because of concerns that activists have raised but because of financial constraints due to COVID-19, the disease caused by the novel coronavirus.

“The city’s financial crisis, coupled with the impact of the COVID-19 pandemic, has resulted in the immediate freeze of new contractual agreements and ‘belt-tightening’ instructions by the Mayor to all city departments for all further expenditures,” the memo said. “Therefore, the Department will immediately discontinue the use of PredPol and its associated reports.”

Activists like Hamid Khan of the Stop LAPD Spying Coalition are calling this a win. And it is, sort of. When something you want stopped stops, it’s still a victory, even if it appears to be due to unforeseeable developments rather than local activism. This doesn’t mean Khan and others shouldn’t keep pushing to ensure the LAPD’s PredPol system stays shut down. But it’s perhaps too optimistic to declare this turn of events a testament to your activism when it appears the LAPD is only temporarily mothballing a program its budget can’t support at the moment.

But we can still hold out hope it won’t be resurrected when the current crisis passes. The LAPD has struggled to show the program actually impacts criminal activity more than it does Constitutional rights, despite having more than a decade to do so. We can still celebrate its death, even if it’s only being buried in effigy at this point. Maybe by the time this has all passed, the LAPD will realize it hasn’t missed the expensive software’s dubious contribution to the city’s safety and abandon it for good.



Comments on “LAPD's Failed Predictive Policing Program The Latest COVID-19 Victim”

Koby (profile) says:

Praise Objectivity

One of the common complaints is that police exhibit a bias, so if police work can be directed without that bias, then I would have to praise that kind of goal. I think the attempt to feed information into a computer system that could objectively make decisions is an admirable one. Unfortunately, the concept of pre-crime analysis is beyond current technological capabilities. The coronavirus is going to be used as a cheap excuse to sweep this one under the rug. Back away slowly, and hope no one will notice.

Anonymous Coward says:

Re: Praise Objectivity

There is an ancient phrase (by computer industry standards, at least): "Garbage In/Garbage Out". Because police are the people who decide what goes into the predatory policing database, their bias goes in at the same time. One of the most insidious things about predatory policing is that simply interacting with police increases your chances of the system recommending more interactions. Those new interactions pile up in the system and ensure increasing police scrutiny: the database accumulates an ever-growing record of police interactions, which tells the system you are a person of ongoing interest to police who should therefore be subjected to increasing harassment.

And it can all start with just a single instance of biased policing.
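The dynamic described above is easy to sketch as a toy simulation. To be clear, everything here is invented for illustration: two areas with *identical* true offense rates, where one area starts with a few extra recorded interactions (the "single instance of biased policing"), and patrols are allocated in proportion to the recorded history. No real PredPol internals are modeled.

```python
import random

def simulate_feedback(rounds=20, patrols_per_round=100, seed=1):
    """Toy model of the predictive-policing feedback loop.

    Both areas have the same true hit rate; area 0 merely starts with a
    biased record. Each round, patrols are allocated in proportion to
    recorded interactions, and every patrol logs a new interaction at
    the same fixed rate in both areas.
    """
    random.seed(seed)
    true_hit_rate = 0.1            # identical underlying rate in both areas
    recorded = [5, 1]              # area 0 starts with extra recorded stops
    for _ in range(rounds):
        total = sum(recorded)
        for area in range(2):
            # patrols follow the record, not the underlying reality
            patrols = round(patrols_per_round * recorded[area] / total)
            for _ in range(patrols):
                if random.random() < true_hit_rate:
                    recorded[area] += 1   # interaction feeds the database
    return recorded
```

Running `simulate_feedback()` shows area 0 ending with a recorded-interaction count several times larger than area 1's, despite identical true offense rates: the initial bias is amplified, never corrected, because the system only ever sees its own output.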

PaulT (profile) says:

Re: Praise Objectivity

"I think the attempt to feed information into a computer system that could objectively make decisions is an admirable one."

But, it’s one that will never work, because whoever feeds the data into the computer and those who program it all have their own biases, sometimes unconscious ones.

The best analogy is the issues surrounding facial and voice recognition systems. There have been numerous complaints with both about how they treat minorities, either returning false positives in law enforcement use cases, or by having a higher error rate when dealing with personal systems. This is more likely to be an implementation error than deliberate prejudice, but it’s there because the people designing and testing the systems are more likely to be white males and less likely to belong to the affected minority.

So, it’s quite likely that any such "pre-crime" system will actually reinforce systemic prejudice rather than prevent it. Combined with the tendency for people to believe whatever a computer tells them over the evidence of their own eyes (see: idiots driving into rivers because the GPS told them there was a bridge there even though they can see there’s not one there), it’s a recipe for disaster without oversight – and the overseers are going to be the ones with the bias problem you’re trying to solve.

gbwa (profile) says:


This is sort of a catch-22. The LAPD would be wasting its resources if they didn’t focus on areas where crime is more likely to occur – a strategy that patrolled all neighbourhoods equally (with the same coverage they currently apply to high-crime areas) would simply be unaffordable.

Same goes for repeat offenders. The data on recidivism rates is pretty disheartening, but the stats are what they are – far and away, the people most likely to commit crimes are those who have committed them in the past.

A population-wide policing strategy should then, naturally, focus on areas with high crime rates and on repeat offenders to maximize return for policing dollar.

The issue, of course, is that a disproportionate number of these areas are predominantly black or hispanic. This reflects a deeper, pretty messed-up socioeconomic issue not unique to LA but certainly highly visible there.

me says:

Monkey See....

"Maybe by the time this has all passed, the LAPD will realize it hasn’t missed the expensive software’s dubious contribution to the city’s safety and abandon it for good."

And we can also hope that the LAPD’s coming experience without this albatross around its neck may help other PDs around the globe decide to do likewise and end their own similarly wasted expenses.

Hey, we can hope, right?

Anonymous Coward says:

It's Weird ...

In Area A, where many criminals live, work, and victimize people, actual police officers interact with actual criminals. The police officers in this area go from call to call to call. Police are extremely busy in these areas; this data goes into a computer system.

In Area B, where mostly law-abiding people live and work, police officers don’t interact much with actual criminals. They spend most of their time on traffic violations and responding to minor calls like noise complaints. Police aren’t very busy in these areas; this data goes into the computer system.

Why does anyone think it’s a big mystery that the computer system tells police to spend time and resources in Area A? Why is this confusing? More to the point, why is it controversial?

Obviously the computer system is a waste of taxpayer funds … because everyone in that city knows where the Area As and Area Bs are. The cops don’t need an expensive computer system to tell them where crimes are happening and going to happen.

Now, if Cushing, Khan, and most Techdirt commenters happen to notice a racial difference between the people in Area As and Area Bs, that’s their problem, not the police department’s. Reality is frequently uncomfortable.
