University Of Chicago Researchers Think They’ve Built A Better Pre-Crime Mousetrap

from the better-guesswork-equals-better-policing? dept

Here are just two of the many things the Securities and Exchange Commission forbids investment companies from putting in their marketing literature:

(B) Representations implying that future gain or income may be inferred from or predicted based on past investment performance; or

(C) Portrayals of past performance, made in a manner which would imply that gains or income realized in the past would be repeated in the future.

No one’s policing police tech with as much zeal, because that’s basically the entirety of predictive policing programs: the assumption that past crime data can project where future crimes are likely to occur.

Predictive policing programs, for the most part, combine garbage data generated by biased policing efforts with proprietary software to generate “heat maps” or “area voted most likely to contain a future crime” or whatever to give law enforcement agencies guidance on how to best deploy their limited resources.

The problem isn’t necessarily the software. Even if it’s robust as fuck, it’s still going to reflect the bias inherent in the raw data. Areas where minorities live tend to be over-policed. Minorities are arrested at rates far exceeding their demographic percentage. Years of overt racism have created skewed data sets that over-represent victims of systemic bias. Predictions based on that data are only going to produce more of the same racist policing. But this time it will look like science, rather than cops rousting black kids just because they can.

Not only is predictive policing a tech-based recycling of decades of bad ideas, it never seems to deliver the crime reduction and community-based policing its advocates claim deployment will produce.

Someone (well, several someones) claim they’ve finally gotten predictive policing right.

Scientists from the University of Chicago have developed a new algorithm that can predict future crime a week in advance with about 90% accuracy, and within a range of about 1000 feet.

It does so by learning patterns from public data on violent and property crimes.

“We report an approach to predict crime in cities at the level of individual events, with predictive accuracy far greater than has been achieved in past,” the authors write.

Sounds great, but what is really being celebrated here? This tool may tell cops what they already know (or believe), but it’s not really a solution. It suggests enforcement and patrols should be concentrated where crimes are likely to occur simply because that’s where crimes have occurred in the past. Being right 90% of the time doesn’t mean more crimes will be prevented. Nor does it mean more cases will be closed. Software with better accuracy can’t change how cops respond to crimes. It can only put a few more cops in certain areas and hope that this somehow produces positive results.

Besides the obvious problem of declaring an area to be the host of future crimes (making everyone in the area a possible suspect until a crime is committed), there’s the problem of bias introduced by the data set. These researchers claim they can mitigate this omnipresent problem of predictive policing.

Somehow this helps?

It divides the city into “spatial tiles” roughly 1,000 feet across, and predicts crime within these areas.

Previous models relied more on traditional neighborhood or political boundaries, which are subject to bias.
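The "spatial tile" idea can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual implementation: the tile size matches the rough 1,000-foot figure reported above, but the coordinate system and data are invented for the example. The point is that incidents are bucketed into a uniform grid rather than into neighborhood or precinct boundaries.

```python
# Hypothetical sketch: bucket incident coordinates into a uniform grid
# of ~1,000-foot tiles instead of using neighborhood boundaries.
# Coordinates and records below are invented for illustration.

from collections import Counter

TILE_FEET = 1000  # approximate tile width reported for the study

def tile_for(x_feet, y_feet):
    """Map a point (in feet, on a local planar grid) to a tile index."""
    return (int(x_feet // TILE_FEET), int(y_feet // TILE_FEET))

# Toy incident records: (x, y) positions in feet.
incidents = [(120, 250), (980, 40), (1500, 300), (1700, 900), (150, 310)]

counts = Counter(tile_for(x, y) for x, y in incidents)
print(counts)  # Counter({(0, 0): 3, (1, 0): 2})
```

A uniform grid at least makes every cell the same size, so a "hot" tile can't be an artifact of one district being drawn larger than another. It does nothing, of course, about which incidents made it into the data in the first place.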

That may prevent snap judgments when heat maps are first seen, but that seems something better suited to, say, setting up Congressional districts than trying to prevent garbage data from generating garbage results. This only changes how the end results are displayed. It doesn’t somehow remove the bias from the underlying data.

And, for all its accuracy, the researchers acknowledged the improved software can’t really do much to reduce biased policing.

The research team also studied the police response to crime by analyzing the number of arrests following incidents and comparing those rates among different neighborhoods.

They found that when crime levels in wealthier areas increased, that resulted in more arrests. But this did not happen in disadvantaged neighborhoods, suggesting an imbalance in police response and enforcement.
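The kind of audit described above boils down to a simple comparison: what fraction of incidents in each area actually leads to an arrest? A minimal sketch, with invented numbers and field names (the study's real data and methodology are more involved):

```python
# Minimal sketch of an enforcement audit: compare how often incidents
# lead to arrests across different areas. All records here are invented
# toy data for illustration.

incident_log = [
    {"area": "wealthy", "arrest_made": True},
    {"area": "wealthy", "arrest_made": True},
    {"area": "wealthy", "arrest_made": False},
    {"area": "disadvantaged", "arrest_made": False},
    {"area": "disadvantaged", "arrest_made": True},
    {"area": "disadvantaged", "arrest_made": False},
    {"area": "disadvantaged", "arrest_made": False},
]

def arrest_rate(log, area):
    """Fraction of incidents in `area` that resulted in an arrest."""
    subset = [r for r in log if r["area"] == area]
    return sum(r["arrest_made"] for r in subset) / len(subset)

for area in ("wealthy", "disadvantaged"):
    print(area, round(arrest_rate(incident_log, area), 2))
# wealthy 0.67
# disadvantaged 0.25
```

A persistent gap like the one in this toy output is the sort of disparity the researchers say their model can surface: same predicted crime levels, very different enforcement follow-through.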

But what if it wasn’t built for cops, but rather for the public and police oversight entities? Perhaps this is how the software should be used.

“We acknowledge the danger that powerful predictive tools place in the hands of over-zealous states in the name of civilian protection,” the authors conclude, “but here we demonstrate their unprecedented ability to audit enforcement biases and hold states accountable in ways inconceivable in the past.”

That sounds like a better use of predictive policing tech: tracking police enforcement activity rather than subjecting citizens to cops who treat everyone in a certain area like a suspect just because a computer told them criminal acts were in the forecast. But no government is willing to spend millions holding officers accountable or providing the public with better insight into law enforcement activities. Those millions have already been earmarked to buy cops more tech under the dubious assumption that past performance is indicative of future results.


Comments on “University Of Chicago Researchers Think They’ve Built A Better Pre-Crime Mousetrap”

Cop_Mocking_Guy says:

What could possibly go wrong...

Well, I’m going to say that this will be a great idea once a non-biased dataset with perfect historic accuracy exists. So, basically, never.

Garbage, racially and ethnically biased policing data is baked into this country, with the police not being required to give the real stats. (And when it’s federally mandated, it’s rarely enforced.) It should be collected whether the police want it to be or not, regardless of how good or bad it makes them look.

We all know… the kind of data that an independent non-profit third-party organization would happily collect, because it faces no direct repercussions over the findings its work produces. (Yeah, wishful thinking on my part!)

Until then predictive policing tech and its proponents can do us a solid and go jump into the nearest active volcano so we don’t have to deal with their rent seeking pseudoscience.

This comment has been flagged by the community.

Marti Luther King says:

It Doesn't Take Genius to Know Black Men are America's Biggest Criminals

What a bunch of woke, left-wing fruitcakes. It isn’t very hard to figure out where the violent crime is happening in Chicago. It’s always the same black neighborhoods. Black men are the biggest killers in America, which hack reporters and left-wing social activists claiming to be researchers can’t admit.

Lostinlodos (profile) says:

Re: Re:

Well, stereotypes don’t exist out of thin air.
It’s a fact that a black male 18-34 is more likely to rob you with a handgun.

And white men 40-65 are more likely to be siphoning funds out of pensions and petty cash (actually, it’s white women, often going after open cash funds).

And parental religious oppression creates twisted kids that shoot bullies in schools. While their parents kill abortion doctors.

The majority of serial killers are middle aged white men. The majority of cyber criminals are under 20.

What’s the point?

Rich (profile) says:

Source Data

Every time I read about some dunce claiming to have magic algorithms that can predict crime, I have to believe it’s based on more than just the history of previous police/public interaction. If it were that simple, there wouldn’t be so much care and attention paid to keeping all of these algorithms secret.

Then I think about all of the articles I have read on this very website about the shady-as-shit data collection that goes on behind people’s backs, the mega data brokers, and the oft-mentioned Third Party Doctrine, and I can’t help but connect the dots. Are these predictive police jizzbags using personal data/location/browsing history for this sort of shit? I realize that this particular article is about the University of Chicago, so I would like to think their algorithms are not quite that evil, but as for the other companies that peddle this garbage, I have zero doubt they would use anything they could get their grubby little mitts on.

I read that in some areas, things like bail and parole are determined by this sort of super-secret, unimpeachable, and unchallengeable algorithmic snake oil, which I would have thought would be terrifying enough to push everything else off the ACLU’s to-do list until every last thread of this practice of determining the freedom of Americans based on data too secret to withstand public review were hunted down, killed, and buried six feet deep with a proverbial stake through the heart, just to be safe. How is any of this shit legal?

Anonymous Coward says:

I know people who can predict future crime a week in advance with about 90% accuracy, and within a range of about 1000 feet — they’re called criminals with planning ability.

And am I the only one who, instead of immediately thinking Minority Report, thought psychohistory and Foundation?

Funny thing is, Isaac Asimov figured that pre-crime would never work due to bias, and you’d have to pan out to the generational level to really be able to predict trends in history. And again: as his books pointed out, all it takes is one variable set incorrectly and everything goes off track.

Lostinlodos (profile) says:

Re: Re:

Floyd was two separate issues.
One, two men with a long history came again into contact and one killed the other.
Two, a small group of sorry excuses for humanity stood by and did nothing while a man was killed.

The Floyd case has nothing to do with prediction. The police responded to a criminal report, the use of fraudulent money.

The real question here is why the police responded at all. Counterfeit/fraudulent currency falls under the Secret Service!

Lostinlodos (profile) says:

Re: Re: Re:4

A) It was the NYT that I got the “multiple employees” claim from. If that is incorrect, has been modified, or has changed, I don’t know. Nor do I care.

B) I don’t subscribe to the fictional delusion that all, most, or even many cops are bad. A man who happened to be a cop. Not “another cop.”

The problem is not police but police protectionism, something seen in nearly every career. How many people did Vince McMahon fuck and then attempt to silence? How many did he rape? And the power cared not.
How many savings accounts did Taylor cross-stack in portfolios? And the powers cared not.

Corrupted protectionism is the problem, not the whole of law enforcement. Bad hiring practices, bad training.
And bad legal protections like “qualified immunity,” a premise that was meant to keep cops from being punished for shooting the duck head running at them with a knife. That slowly morphed into a shield protecting the tiny percentage of criminal cops who slaughter bystanders while chasing a man running away.

Lostinlodos (profile) says:

Re: Re: Re:6

Obviously, and you clearly care just as much about people.

My interest is no secret here. Family, friends, self. In that order.
There was a murder two days ago less than a half mile from my residence. There have been 7 armed carjackings, over 20 armed robberies, and two open-air shootouts in my town.

Do I care about a bad cop killing a criminal without cause? Yes. Am I going to follow every aspect of a story in a town far away? No.

Tanner Andrews (profile) says:

Re: Re: Re:5 incomplete diagnosis

The problem is not police but police protectionism.

Yeah, police protectionism goes with police.

It is generally spelled “Fraternal Order of Police,” but sometimes the union has a different name. Almost always, though, it is to the same effect: they never saw a beating or killing by police where they would not defend the officer.

Tanner Andrews (profile) says:

So, How Come Is It

Not sure why technology moves backward, but the old site allowed you to flag spammy comments without needing javascript (firefox 78.4.0esr, linux 5.4.72-gentoo). New site does not. Are the folks at the new site unable to find the code from the old site so as to accomplish the same things we were doing years ago?

Lostinlodos (profile) says:


You sure do bounce around OSs. I thought I was bad.

The old site was, well, old. They had years of backend fixes to make it work.
The new site is—new.
Give it time.

I’ll say this: your Firefox is 16 releases behind LTS. I don’t believe it supports PHP 8.x at all. So here, at least, it looks like you’re bumping into a fallback.

Send a bug report to TD and let them know. But this time I really do believe the problem is on your end.
