
Predictive Policing Makes Everyone A Suspect, Even EU Officials

from the presumptive-policing-actually dept

We’ve long known so-called “predictive policing” is garbage. It’s the same old biased policing, except shinier and more expensive. Every system in place relies on data generated by police work — data instantly tainted by the things cops do, like hassling minorities, engaging in very selective enforcement, and treating people as inherently suspicious just because of where they live. These acts generate the garbage data that ensures more garbage data will be generated once all the digital gears stop turning.

To highlight the worthlessness of predictive policing tools, criminal justice watchdog Fair Trials has mocked up an input tool of its own — one any site visitor can interact with and experience the faint horror of being prejudged a criminal by a set of seemingly innocuous questions.

Attend more than one school while growing up? Is your credit rating a bit too low? Have you ever witnessed a crime? Ever been the recipient of government benefits? Are you a minority? Ever spoken to the cops for any reason? Answer enough of these questions with a “yes” and you’ll head right up into “High Risk” territory — the sort of thing that tends to generate even more interactions with police officers utilizing predpol data… which then generates even more data ensuring you’ll remain on the “High Risk” list in perpetuity.
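The underlying mechanic is easy enough to sketch in code. Here’s a minimal Python illustration of the kind of additive scoring such a quiz implies — the factor names, weights, and thresholds below are all invented for this example, not Fair Trials’ actual internals:

# A minimal sketch of additive risk scoring, as implied by the quiz.
# Factor names, weights, and thresholds are invented for illustration.

RISK_WEIGHTS = {
    "attended_multiple_schools": 1,
    "low_credit_rating": 2,
    "witnessed_a_crime": 1,
    "received_government_benefits": 2,
    "member_of_minority_group": 3,
    "prior_police_contact": 3,
}

def risk_label(answers: dict[str, bool]) -> str:
    """Sum the weights of every 'yes' answer and bucket the total."""
    score = sum(RISK_WEIGHTS[factor] for factor, yes in answers.items() if yes)
    if score >= 6:
        return "High Risk"
    if score >= 3:
        return "Medium Risk"
    return "Low Risk"

# Two entirely innocuous 'yes' answers already reach Medium Risk.
answers = {factor: False for factor in RISK_WEIGHTS}
answers["witnessed_a_crime"] = True
answers["prior_police_contact"] = True
print(risk_label(answers))  # -> Medium Risk (1 + 3 = 4)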

Sure, this is an extremely simplified version of software governments pay millions to purchase, but the risk factors presented are all used in predictive policing. And, as Fair Trials points out, these same sorts of systems are used by judges to determine bail amounts and sentence lengths — things that can be increased simply because a person has done nothing more than witness a crime or fall behind on their bills.

Since it’s incredibly easy to rack up risk factors just by living your life, it’s no surprise even people with presumably the cleanest backgrounds can still find themselves listed among the troublesome by predictive policing algorithms. As Thomas Macaulay reports for The Next Web, Fair Trials’ predpol quiz has snagged a number of EU officials.

Politicians from the Socialists & Democrats, Renew, Greens/EFA, and the Left Group were invited to test the tool. After completing the quiz, MEPs Karen Melchior, Cornelia Ernst, Tiemo Wölken, Petar Vitanov, and Patrick Breyer were all identified as at “medium risk” of committing future crime.

As noted above, Fair Trials has presented a very simplified version of predictive policing software. But the questions used are very representative of how this software presents people to police officers, prosecutors, and judges. It takes a bunch of demographic data, conjures up networks of suspected criminals out of interactions, proximity and societal background, and spits out lists of high-risk people for cops to hassle. The end result is the laundering of biased policing via expensive black boxes that give the usual selective enforcement efforts a veneer of cold, hard science.
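None of these vendors publish their network analysis, but the basic move — letting suspicion leak across social links — takes only a few lines to sketch. Everything below (the people, scores, links, and blending factor) is an invented illustration, not any vendor’s algorithm:

# A sketch of network-propagated suspicion: part of each person's score
# comes from the people they are linked to (arrests together, shared
# addresses, stops in proximity). All values are invented.

base_scores = {"A": 5.0, "B": 0.0, "C": 0.0}    # only A has any record
links = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}

def propagate(scores, links, alpha=0.5, rounds=3):
    """Blend each score with the average of its neighbours' scores."""
    for _ in range(rounds):
        scores = {
            person: (1 - alpha) * scores[person]
            + alpha * sum(scores[n] for n in neighbours) / len(neighbours)
            for person, neighbours in links.items()
        }
    return scores

print(propagate(base_scores, links))
# B and C now carry nonzero "risk" purely through association with A.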

But underneath all the ones and zeroes, it’s basically still just cops going after poor people, minorities, foreigners, and anyone else perceived to be an easy target. Spending millions on proprietary algorithms doesn’t change a thing.



Comments on “Predictive Policing Makes Everyone A Suspect, Even EU Officials”

PaulT (profile) says:

“If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him.” – Cardinal Richelieu

I believe that’s from the 19th-century author Alexandre Dumas’s depiction of him rather than something the real-life 17th-century figure said himself, but these aren’t exactly new lessons being learned here.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

'You trust it to judge them, now how about judging you?'

Whether politician or police, if you’re willing to risk ruining someone’s life by using a particular bit of tech and/or system, you should have no problem running yourself through it, just to be thorough and demonstrate your confidence in its accuracy. If anything, demonstrations like this should be mandatory and hold just as much legal weight as they would for a general member of the public.

This comment has been deemed funny by the community.
Anonymous Coward says:

Re:

I imagine most police officers would have no issue being tested by such an algorithm. Data provided by police departments across the country shows that members of police departments practically never commit crimes. And while there may be a few regrettable data points to the contrary which have unavoidably slipped into the data set, those are clearly the result of a few bad apples, which any reasonable algorithm would recognize as statistical outliers to be excluded from the analysis.

As such, a few simple risk factor questions like “Are you a member of law enforcement?” should be enough to correct any police officer’s Algorithmic Suspicion Score (TM) back down to zero where it belongs. That would prevent any unfortunate false positives, while still ensuring police departments can reliably identify the real bad actors through assessment of well-known threat-correlated factors, like economic hardship, membership of any frequently-arrested demographic group, or having expressed any negative opinion of law enforcement.

Opaque law enforcement algorithms are a clear net benefit to society, since they bring unprecedented speed and efficiency along with exactly the same effectiveness as many traditional targeting methods used by police. That includes ensuring that, regardless of your other behaviors, if you’re a cop, your A.S.S. (TM) is covered.

David says:

Predictive policing isn't garbage

It’s working with a priori probabilities, which gives a better hit rate than randomness. Call it experience, call it bias, call it prejudice.

The problem is, we have an understanding of justice that relies on individual accountability for punishment and reward.

And punishment and reward are intended to act as incentives towards desirable behavior, so it’s particularly important to reward better than expected behavior and punish worse than expected behavior in order to facilitate change to the better.

“Predictive policing” is dangerous because it may stifle any breathing room required to allow change to the better.

The problem with self-fulfilling prophecies is not that they don’t work, but that they work. That means you need to get away from making prediction accuracy a principal metric for policing. Not because it doesn’t work, but because the consequences of it working are counterproductive.

And that’s a very unpopular sell.
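The arithmetic behind David’s first claim — and the trap it sets — is worth making explicit. A minimal numeric sketch, using invented offense rates for two hypothetical groups:

# Why prior-based targeting beats random stops on measured hit rate.
# Population shares and offense rates are invented, purely illustrative.

population = {"group_x": 0.5, "group_y": 0.5}       # population shares
offense_rate = {"group_x": 0.10, "group_y": 0.05}   # assumed rates

# Random stops: hit rate is the population-weighted average.
random_hit_rate = sum(population[g] * offense_rate[g] for g in population)
print(f"random targeting:      {random_hit_rate:.1%}")    # 7.5%

# Prior-driven stops: concentrate entirely on the higher-rate group.
targeted_hit_rate = offense_rate["group_x"]
print(f"prior-based targeting: {targeted_hit_rate:.1%}")  # 10.0%

# The better hit rate says nothing about the people never stopped — and
# every stop in group_x generates records that harden the prior further.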

Anonymous Coward says:

As noted above, Fair Trials has presented a very simplified version of predictive policing software.

Not really THAT much simplified, sad to say. Change the weights, add a few more factors and you have million-dollar software.

Bonus points if you add in an audit trail, but that’s not necessary. After all, you Trust The Computer, don’t you?

Anon says:

As I learned...

As I learned almost 50 years ago (!) in Comp Sci –

GIGO – Garbage in, garbage out.

But do AI policing methods really apply to individual people? That only works when combined with facial recognition, I suppose (Orwell #2 scenario), so otherwise it applies to geography. Of course the areas with more crime will be predicted as having more crime. The areas predicted to have more crime are the ones with higher population density, lower incomes, and (naturally) more crime already found by the police. Police will then concentrate there. Result: a major feedback cycle validating the prediction and the software. “Look! We found crime where we looked for crime!”
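That feedback cycle is simple enough to simulate. Here’s a toy model — every number invented — of two areas with identical true crime rates, where patrols are allocated in proportion to past records:

# Toy simulation of the predict-patrol-record feedback loop. Both areas
# have the SAME true crime rate; area_a merely starts with more records.

import random

random.seed(1)
TRUE_RATE = 0.05                          # identical in both areas
recorded = {"area_a": 10, "area_b": 5}    # initial recorded crimes

for year in range(10):
    total = sum(recorded.values())
    for area in recorded:
        # Patrols are allocated in proportion to recorded crime...
        patrols = int(100 * recorded[area] / total)
        # ...and each patrol finds crime at the same true rate everywhere.
        found = sum(1 for _ in range(patrols) if random.random() < TRUE_RATE)
        recorded[area] += found

print(recorded)
# area_a ends up with roughly twice area_b's recorded crime, "validating"
# the prediction, even though the underlying rates were identical.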
