Not Ready For Prime Time: UK Law Enforcement Facial Recognition Software Producing Tons Of False Positives [Updated]

from the Citizen-Suspect dept

Law enforcement agencies have embraced facial recognition. And contractors have returned the embrace, offering up a variety of “solutions” that are long on promise, but short on accuracy. That hasn’t stopped the mutual attraction, as government agencies are apparently willing to sacrifice people’s lives and freedom during these extended beta tests.

The latest example of widespread failure comes from the UK, where the government’s embrace of surveillance equipment far exceeds that of the United States. Matt Burgess of Wired obtained documents detailing the South Wales Police’s deployment of automated facial recognition software. What’s shown in the FOI docs should worry everyone who isn’t part of UK law enforcement. (It should worry law enforcement as well, but strangely does not seem to bother them.)

During the UEFA Champions League Final week in Wales last June, when the facial recognition cameras were used for the first time, there were 2,470 alerts of possible matches from the automated system. Of these 2,297 turned out to be false positives and 173 were correctly identified – 92 per cent of matches were incorrect.

That’s the gaudiest number returned in response to the records request. But the other numbers, even though they come from smaller sample sets, are just as terrible. The following table comes from the South Wales Police FOI response [PDF]:

In all but three cases, false positives outnumbered correct hits. (And in one of those cases, it was a 0-0 tie.) The police blame the nearly 2,300 false positives on garbage intake.

A spokesperson for the force blamed the low quality of images in its database and the fact that it was the first time the system had been used.
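For anyone checking the math at home, the headline figure is simple arithmetic. Here's a minimal sketch in Python (purely illustrative; the counts are the ones quoted above, and note the ratio actually rounds to about 93 per cent):

```python
# False positive arithmetic from the South Wales Police FOI figures:
# 2,470 alerts during UEFA Champions League Final week, 173 correct.
alerts = 2470
true_positives = 173
false_positives = alerts - true_positives  # 2,297

false_discovery_rate = false_positives / alerts
precision = true_positives / alerts

print(f"false discovery rate: {false_discovery_rate:.1%}")  # 93.0%
print(f"precision: {precision:.1%}")                        # 7.0%
```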

The company behind the tech insists this is an end user problem.

The company behind the facial recognition system, NEC, told ZDNet last year that large watchlists lead to a high number of false positives.

And it illustrates this with a highly questionable analogy.

“We don’t notice it, we don’t see millions of people in one shot … but how many times have people walked down the street following somebody that they thought was somebody they knew, only to find it isn’t that person?” NEC Europe head of Global Face Recognition Solutions Chris de Silva told ZDNet in October.

I think most people who see someone they think they know might wave or say “Hi,” but only the weirdest will follow them around attempting to determine if they are who they think they are. Even if everyone’s a proto-stalker like NEC’s front man seems to think, the worst that could happen is an awkward (and short) conversation. The worst case scenario for a false positive triggered by law enforcement software is some time in jail and an arrest record. De Silva’s analogy doesn’t even begin to capture the personal stakes for wrongly identified citizens.

If large watchlists are the problem, UK law enforcement is actively seeking to make it worse. Wired reports the South Wales Police are looking forward to adding the Police National Database (19 million images) to their watchlist, along with other sources like driver’s license databases.
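A hedged back-of-the-envelope sketch shows why that expansion matters. Every parameter below is an assumption for illustration, not a figure from the FOI response: if each face-versus-record comparison carries a tiny false match rate, the odds of an innocent face triggering at least one alert climb quickly with the size of the watchlist.

```python
# Illustrative only: assumed parameters, not real system figures.
per_comparison_fmr = 1e-5   # assumed false match rate per face-vs-record comparison
faces_scanned = 170_000     # assumed number of faces scanned over an event week

for watchlist_size in (500, 19_000_000):  # small local list vs. Police National Database
    # P(an innocent face falsely matches at least one watchlist record)
    p_false_alert = 1 - (1 - per_comparison_fmr) ** watchlist_size
    expected_false_alerts = faces_scanned * p_false_alert
    print(f"watchlist of {watchlist_size:>10,}: "
          f"P(false alert) = {p_false_alert:.4f}, "
          f"expected false alerts = {expected_false_alerts:,.0f}")
```

Under these toy numbers, going from a 500-entry list to a 19-million-image database takes a false alert from a rare event to a near certainty for every face scanned.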

No matter what the real issue is here, the South Wales Police believe there are no adverse effects to rolling out facial recognition tech that’s wrong far more often than it’s right. The force says it has yet to make a false arrest based on bogus hits, but its privacy assessment shows it’s not all that concerned about the people swept up by poorly-performing software.

South Wales Police, in its privacy assessment of the technology, says it is a “significant advantage” that no “co-operation” is required from a person.

Sure, it’s an “advantage,” but one that solely serves law enforcement. It allows them to gather garbage images and run them against watchlists while hoping the false hits won’t result in the violation of an innocent person’s rights. But that’s all they have: hope. The tech isn’t ready for deployment. But it has been deployed and UK citizens are the beta testing group.

So, it will come as an unpleasant non-surprise that Axon (Taser’s body cam spinoff) is looking to add facial recognition tech to cameras officers are supposed to deploy only in certain circumstances. This addition will repurpose them into always-on surveillance devices, gathering up faces with the same efficiency as their automated license plate readers. False positives will continue to be a problem and deployment will scale far faster than tech advancements.

UPDATE: Axon apparently takes issue with the final paragraph of this post. It has demanded a correction to remove an unspecified “error” and to smooth the corners off some “bold claims.” Here’s Axon’s full statement:

At this point in time, we are not working on facial recognition technology to be deployed on body cameras. While we do see the value in this future capability, we also appreciate the concerns around privacy rights and the risks associated with misidentification of individuals. Accordingly, we have chosen to first form an AI Ethics Board to help ensure we balance both the risks and the benefits of deploying this technology. At Axon we are committed to ensuring that the technology we develop makes the world a better, and a safer place.

If there’s anything to be disputed in the last paragraph of the post, it might be “looking to add facial recognition tech to its cameras.” But more than one source (including the one linked in the paragraph) makes the same claim about Axon looking at the possibility of adding this tech to its body camera line, so while Axon may not be currently working on it, it appears to be something it is considering. The addition of an ethics board is certainly the right way to approach this issue and its privacy concerns, but Axon’s statement does not actually dispute the assertions I made in the post.

As for the rest of the paragraph, I will clarify that I did not mean Axon specifically will push for body cameras to become the facial equivalent of ALPRs. Axon likely won’t. But police departments will. If the tech is present, it will be used. And history shows the tech will be deployed aggressively under minimal oversight, with apologies and policies appearing only after some damage has been done. To be sure, accuracy will improve as time goes on. But as the UK law enforcement efforts show, deployment will far outpace tech advancements, increasing the probability of wrongful arrests and detentions.


Comments on “Not Ready For Prime Time: UK Law Enforcement Facial Recognition Software Producing Tons Of False Positives [Updated]”

Anonymous Coward says:

Let’s imagine you have a needle in a haystack and you are tasked to find it because it’s a murderous SOB. Let’s imagine you have a technology called, say, haycial recognition, that can split that haystack into two piles, while letting you know that the murder-needle is in the far smaller pile. This is a good technology DESPITE having a high false-positive rate, because it’s not being used to determine if something is a needle-what-kills, but to narrow the field of things to search through to find an already known death-kill-meanie-needle.

The false positives in this blatant non-analogy aren’t harming anything except the police budget, and in that case the technology is still saving more time and effort (and your taxes!) than would be spent without it.

If you want to shit on facial recognition, shit on the aspect that deserves it, namely the destruction of privacy and abuses it enables. False positives are pretty irrelevant.

Anonymous Coward says:

Re: Re:

Let’s imagine you have a needle in a haystack and you are tasked to find it because it’s a murderous SOB. Let’s imagine you have a technology called, say, haycial recognition, that can split that haystack into two piles, while letting you know that the murder-needle is in the far smaller pile.

You’re assuming no false negatives, and that’s probably not a valid assumption. The false negative rate seems unknowable. If police assume the murder-needle is not in the big pile but it actually is, that’s gotta be a bad thing.

The false positives in this blatant non-analogy aren’t harming anything except the police budget

Sure, and there’s no reason taxpayers should be concerned about that, right?

False positives are pretty irrelevant.

That depends highly on what the police do when the system indicates a match. I assume that at a minimum they are detaining people to determine their identity, and that alone rises to a level above "irrelevant".

Anonymous Coward says:

Re: Re: Re:

That depends highly on what the police do when the system indicates a match.

Being the UK, the first thing that they will likely do is have a close look at the indicated subject and the photos that they have of them. They will then look for an opportunity to approach and identify the person, maintaining formal politeness while doing so.

Re: says:

Re: Re:

Except, of course, you don’t. You have forgotten about the false negatives.

You have no idea if the system is giving you a true negative or a false negative. All you have done here is check out all the positives to determine if they are true or false.

In your analogy, all that has been done here is to scan the haystack, check out a couple of thousand pieces of hay, and assume that what is left of the haystack does not contain the needle.

Anonymous Coward says:

Re: Re:

As others have mentioned… you don’t actually know how many needles are in the haystack.

It could be none, it could be 20,000.

If you found 3 positives and 10 false positives out of 20,000 needles present, are you content that it’s a job well done? Or was it a seriously flawed POS?

We’ll never know how well the software truly works, but relying on the facial recognition software to do the job with the assumption that it’s better than not using the software – could be a major mistake.
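The disagreement in this thread is, at bottom, about base rates. A minimal Bayes sketch (every number below is an assumption for illustration; none comes from the FOI response) shows why, in a crowd that is almost entirely innocent, even a fairly capable system produces mostly false alerts, and why the value of the “small pile” depends on a sensitivity figure the police never measure:

```python
# Toy numbers only: assumed base rate and error rates.
base_rate = 1 / 10_000    # assumed fraction of the crowd actually wanted
sensitivity = 0.90        # assumed P(alert | wanted); unknowable from alert-only audits
false_alarm_rate = 0.01   # assumed P(alert | innocent)

# Bayes' theorem: P(wanted | alert)
p_alert = sensitivity * base_rate + false_alarm_rate * (1 - base_rate)
p_wanted_given_alert = (sensitivity * base_rate) / p_alert

print(f"P(actually wanted | alert) = {p_wanted_given_alert:.2%}")  # ~0.89%
```

Even with generous assumed error rates, the overwhelming majority of alerts point at innocent people, in line with the FOI figures above.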

Peter says:

How did they determine the true accuracy?

Did they determine whether each scan that returned no match was actually a true negative and not a false negative? Because just as important as ensuring they did not get a false positive on an innocent person is ensuring they did not give a ‘pass’ to a wanted person.

In other words, did they check that every single scan of every person was returning the correct result in order to determine the true accuracy of the system? I am going to go out on a limb here and say, did they bollocks.
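Peter’s question maps directly onto a confusion matrix. In this sketch, the “alert” row uses the counts from the FOI response; the “no alert” row is exactly the part nobody checked:

```python
# Auditing only the alerts fills in one row of the confusion matrix.
confusion = {
    ("alert",    "wanted"):   173,   # true positives (checked by officers)
    ("alert",    "innocent"): 2297,  # false positives (checked by officers)
    ("no alert", "wanted"):   None,  # false negatives: never checked
    ("no alert", "innocent"): None,  # true negatives: never checked
}

tp = confusion[("alert", "wanted")]
fp = confusion[("alert", "innocent")]
print(f"precision = {tp / (tp + fp):.1%}")  # computable: ~7.0%
print("recall and overall accuracy: undefined without the 'no alert' row")
```

Precision is the only metric an alerts-only audit can produce; recall and overall accuracy stay undefined without sampling the people the system waved through.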

Anonymous Coward says:

Re: How did they determine the true accuracy?

As almost every adult in the country has been photographed by the authorities, such as when they apply for a passport, driver’s license, or other government-issued ID, it should be technically possible to identify practically every single person in the stadium. That’s obviously not what’s happening here, but just give it time.

Anonymous Coward says:

Re: Re: How did they determine the true accuracy?

To make such a system work, it would need a periodic update because people and their faces do not remain unchanged over time and their “AI” software does a poor job of predicting these changes.

So – how often? Possibly, in the future, your vehicle may be disabled until you update your facial recognition profile … daily.

Anonymous Coward says:

Re: Re: Re: How did they determine the true accuracy?

Fingerprints also change over time, and that’s something that government agencies apparently don’t know about, or care about.

Fingerprints are not just used for finding ‘bad’ people, it’s your badge of personal identity that the government has on file that’s considered permanent (but really isn’t, at least not in the eyes of a computer).

Applying for US citizenship is one such example. If your current fingerprints don’t precisely computer-match those on file that were taken by the INS many years or decades ago, then you are essentially a non-person in the eyes of the law and are not eligible to apply for citizenship. Proving that you are indeed still the same person could be an expensive, lengthy, and time-consuming uphill legal battle.

So "what’s the big deal?" you might ask. Even for people who never plan to ever vote in an election, citizenship becomes an extremely important issue in a person’s old age, because US citizens can inherit a deceased spouse’s property tax-free, while non-citizen permanent residents ("green card" holders) are hit with a whopping 40% federal tax on all assets upon the death of a spouse.

But back to the subject of automatic facial recognition issues: it’s almost a certainty that it will end up being abused in some ways, or the system trusted more than it ever should be, such as with innocent people having to suffer because "the computer says so."

That One Guy (profile) says:

'More justifications for searches? Where do we sign?!'

(It should worry law enforcement as well, but strangely does not seem to bother them.)

If the UK police are even remotely similar to the US police then there’s nothing ‘strange’ about it. More false positives means more chances to search people and possibly find something incriminating they can use to boost the ‘look at all the criminals we’re finding!’ numbers.

That this would have massive negative impacts on the privacy of everyone searched is a sacrifice they are valiantly willing to have the public pay.

Richard (profile) says:

Re: 'More justifications for searches? Where do we sign?!'

If the UK police are even remotely similar to the US police

Fortunately they aren’t, at least for now.

For one thing they don’t have the option of shooting first and asking questions afterwards.

Traditionally there have been some pretty good senior police officers in the UK.

e.g. John Alderson:
https://www.theguardian.com/uk/2011/oct/11/john-alderson

Sadly, they remain quite rare.

Cowardly Lion says:

Re: Re: 'More justifications for searches? Where do we sign?!'

And then there was this monster…

http://www.gayinthe80s.com/2014/05/1986-politics-manchesters-chief-constable-james-anderton/

What with his notorious frothing hatred of “licentious dancing”, and anyone in general just out enjoying themselves, oh, and the £32k of public money (a fortune in the ’80s) he lavished on himself by putting in luxurious thick pile shag in his Old Trafford office… this dolt was universally loathed.

Anonymous Coward says:

Re: Re: 'More justifications for searches? Where do we sign?!'

I am in the UK - are you? If you are UK-based, you must have a bad memory or be unfamiliar with UK police cock-ups.
Remember the de Menezes murder by the police?
Misidentification led to multiple close-range head shots and a fatality at the hands of the Met (London police).

https://en.m.wikipedia.org/wiki/Death_of_Jean_Charles_de_Menezes

As is usual for these cases (innocent people executed by UK police), nobody in the police chain of command was jailed.

Anonymous Coward says:

Reid Technique going high tech

That multi-colored facial recognition van reminds me of the UK’s infamous television detector vans patrolling the streets that they tell the public can peer into houses and find unlicensed TV sets inside, prompting the residents caught this way to immediately fess up and pay the fine. Of course the claimed electronic wizardry was a lie, and the whole operation turned out to be just a form of Reid Technique applied to people who chose not to pay their TV tax.

Likewise with AFR claims, we should be naturally skeptical in case there is some degree of smoke and mirrors going on and they’re trying to fudge the test results or game the system in some way.

Also, in a stadium full of people getting scanned, there would likely be hundreds, and possibly even thousands of people who are wanted by the police for some reason, mostly on minor offenses like failing to appear in court or pay a fine. Apparently most of them got missed and walked out of the stadium undetected.

Ninja (profile) says:

The high false positive ratio would be less of a problem if we had proper law enforcement agents and procedures. By proper I mean: match detected, officers go to the person to confirm it visually, ask for ID if needed, and let the false positive go with a smile and politeness. This could be true in a few of the more civilized countries, but we know that elsewhere it may more often than not result in innocent deaths.

Facial recognition isn’t ready for prime time. Neither is law enforcement. Or rather, neither is humanity.

Anonymous Coward says:

Surprising optimism by Techdirt author

The worst case scenario for false positives triggered by law enforcement software is some time in jail and an arrest record.

Based on other recent reporting, including on Techdirt, of officers using grossly excessive force with no penalty, I’d think this sentence should be:

— The worst case scenario for false positives triggered by law enforcement software is some unnecessary fatalities when the police shoot an innocent individual misidentified by the software as a match to an "armed and dangerous" fugitive cop-killer. If American cops were involved, figure some bystanders will get caught in the cross-fire too.

Coyne Tibbets (profile) says:

Only a Total Ban Will Help

“If you put garbage in a computer nothing comes out but garbage. But this garbage, having passed through a very expensive machine, is somehow ennobled and none dare criticize it.” – Rory Bremner

Since none dare criticize the conclusions facial ID generates, more false arrests will follow… at least until the tech is finally banned everywhere and relegated to a remote volcano, forever.
