Not Ready For Prime Time: UK Law Enforcement Facial Recognition Software Producing Tons Of False Positives [Updated]

from the Citizen-Suspect dept

Law enforcement agencies have embraced facial recognition. And contractors have returned the embrace, offering up a variety of "solutions" that are long on promise, but short on accuracy. That hasn't stopped the mutual attraction, as government agencies are apparently willing to sacrifice people's lives and freedom during these extended beta tests.

The latest example of widespread failure comes from the UK, where the government's embrace of surveillance equipment far exceeds that of the United States. Matt Burgess of Wired obtained documents detailing the South Wales Police's deployment of automated facial recognition software. What's shown in the FOI docs should worry everyone who isn't part of UK law enforcement. (It should worry law enforcement as well, but strangely does not seem to bother them.)

During the UEFA Champions League Final week in Wales last June, when the facial recognition cameras were used for the first time, there were 2,470 alerts of possible matches from the automated system. Of these 2,297 turned out to be false positives and 173 were correctly identified – 92 per cent of matches were incorrect.

That's the gaudiest number returned in response to the records request. But the other numbers -- even with their smaller sample sizes -- are just as terrible. The following table comes from the South Wales Police FOI response [PDF]:

In all but three cases, false positives outnumbered true hits. (And in one of those three, it was a 0-0 tie.) The police blame the 2,300 false positives on garbage intake.
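The headline figure is simple arithmetic on the quoted Champions League numbers, and it is worth checking: 2,297 false alerts out of 2,470 actually rounds to 93 per cent incorrect, marginally worse than the reported 92. A quick sketch:

```python
# Figures from the South Wales Police FOI response, as reported by Wired.
total_alerts = 2470      # possible matches flagged during Champions League week
false_positives = 2297   # alerts that turned out to be the wrong person
true_positives = 173     # alerts that were correct

# The two outcomes should account for every alert.
assert false_positives + true_positives == total_alerts

# Share of alerts that were wrong -- the false discovery rate.
fdr = false_positives / total_alerts
print(f"{fdr:.1%} of matches were incorrect")  # prints "93.0% of matches were incorrect"
```

Note also what this number is not: it says nothing about false negatives -- watchlisted people the system silently missed -- and the FOI response gives no way to estimate those.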

A spokesperson for the force blamed the low quality of images in its database and the fact that it was the first time the system had been used.

The company behind the tech insists this is an end user problem.

The company behind the facial recognition system, NEC, told ZDNet last year that large watchlists lead to a high number of false positives.

And it illustrates this with a highly questionable analogy.

"We don't notice it, we don't see millions of people in one shot ... but how many times have people walked down the street following somebody that they thought was somebody they knew, only to find it isn't that person?" NEC Europe head of Global Face Recognition Solutions Chris de Silva told ZDNet in October.

I think most people who see someone they think they know might wave or say "Hi," but only the weirdest will follow them around trying to confirm the identification. Even if everyone's a proto-stalker, as NEC's front man seems to think, the worst that can happen is an awkward (and short) conversation. The worst-case scenario for a false positive triggered by law enforcement software is time in jail and an arrest record. De Silva's analogy doesn't begin to capture the personal stakes for a wrongly identified citizen.

If large watchlists are the problem, UK law enforcement is actively making it worse. Wired reports the South Wales Police are looking forward to adding the Police National Database (19 million images) to its watchlist, along with other sources like driver's license databases.
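NEC's watchlist point is, at bottom, the base rate problem: when every face in a crowd is compared against millions of entries, even a tiny per-comparison false match rate generates false alerts in bulk, while true alerts are capped by the handful of watchlisted people actually present. A back-of-envelope sketch -- every rate below is hypothetical, since the deployed system's real error rates aren't public -- shows why growing the list toward 19 million images makes the ratio worse, not better:

```python
# Back-of-envelope illustration of the base rate problem with large watchlists.
# All rates here are made up, chosen only to show the shape of the effect; the
# real per-comparison error rates of deployed systems are not public.

def expected_alerts(crowd, watchlist, fmr, wanted_in_crowd, hit_rate):
    """Expected true and false alerts for one event.

    crowd:           faces scanned at the event
    watchlist:       entries each face is compared against
    fmr:             false match rate per face-vs-entry comparison
    wanted_in_crowd: watchlisted people actually present
    hit_rate:        chance a present watchlisted person is correctly flagged
    """
    false_alerts = crowd * watchlist * fmr   # errors scale with list size
    true_alerts = wanted_in_crowd * hit_rate # capped by who is actually there
    return true_alerts, false_alerts

# A modest watchlist vs. a 19-million-image one, same hypothetical matcher.
for watchlist in (500_000, 19_000_000):
    true_a, false_a = expected_alerts(
        crowd=100_000, watchlist=watchlist, fmr=1e-9,
        wanted_in_crowd=50, hit_rate=0.8)
    share_false = false_a / (true_a + false_a)
    print(f"watchlist {watchlist:>10,}: ~{false_a:,.0f} false alerts, "
          f"{share_false:.0%} of all alerts false")
```

With these made-up numbers, the smaller list produces roughly as many false alerts as true ones; the 19-million-entry list pushes false alerts to about 98 per cent of the total -- the same territory as the South Wales results -- without the matcher getting any worse.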

No matter what the real issue is here, the South Wales Police believe there are no adverse effects to rolling out facial recognition tech that's wrong far more often than it's right. The force says it has yet to make a false arrest based on a bogus hit, but its privacy assessment shows it's not all that concerned about the people swept up by poorly-performing software.

South Wales Police, in its privacy assessment of the technology, says it is a "significant advantage" that no "co-operation" is required from a person.

Sure, it's an "advantage," but one that solely serves law enforcement. It allows them to gather garbage images and run them against watchlists while hoping the false hits won't result in the violation of an innocent person's rights. But that's all they have: hope. The tech isn't ready for deployment. But it has been deployed and UK citizens are the beta testing group.

So, it will come as an unpleasant non-surprise that Axon (Taser's body cam spinoff) is looking to add facial recognition tech to cameras officers are supposed to deploy only in certain circumstances. This addition will repurpose them into always-on surveillance devices, gathering up faces with the same efficiency as their automated license plate readers. False positives will continue to be a problem and deployment will scale far faster than tech advancements.

UPDATE: Axon apparently takes issue with the final paragraph of this post. It has demanded a correction to remove an unspecified "error" and to smooth the corners off some "bold claims." Here's Axon's full statement:

At this point in time, we are not working on facial recognition technology to be deployed on body cameras. While we do see the value in this future capability, we also appreciate the concerns around privacy rights and the risks associated with misidentification of individuals. Accordingly, we have chosen to first form an AI Ethics Board to help ensure we balance both the risks and the benefits of deploying this technology. At Axon we are committed to ensuring that the technology we develop makes the world a better, and a safer place.

If there's anything to be disputed in the last paragraph of the post, it might be "looking to add facial recognition tech to its cameras." But more than one source (including the one linked in the paragraph) makes the same claim about Axon looking at the possibility of adding this tech to its body camera line, so while Axon may not be currently working on it, it appears to be something it is considering. The addition of an ethics board is certainly the right way to approach this issue and its privacy concerns, but Axon's statement does not actually dispute the assertions I made in the post.

As for the rest of the paragraph, I will clarify that I did not mean Axon specifically will push for body cameras to become the facial-recognition equivalent of ALPRs. Axon likely won't. But police departments will. If the tech is present, it will be used. And history shows it will be deployed aggressively under minimal oversight, with apologies and policies appearing only after some damage has been done. To be sure, accuracy will improve as time goes on. But as the UK law enforcement efforts show, deployment will far outpace technical advancement, increasing the probability of wrongful arrests and detentions.


Reader Comments



  • Anonymous Coward, 9 May 2018 @ 8:20pm

    Let's imagine you have a needle in a haystack and you are tasked to find it because it's a murderous SOB. Let's imagine you have a technology called, say, haycial recognition, that can split that haystack into two piles, while letting you know that the murder-needle is in the far smaller pile. This is a good technology DESPITE having a high false-positive rate, because it's not being used to determine if something is a needle-what-kills, but to narrow the field of things to search through to find an already known death-kill-meanie-needle.

    The false positives in this blatant non-analogy aren't harming anything except the police budget, and in that case the technology is still saving more time and effort (and your taxes!) than would be spent without it.

    If you want to shit on facial recognition, shit on the aspect that deserves it, namely the destruction of privacy and abuses it enables. False positives are pretty irrelevant.


    • Anonymous Coward, 9 May 2018 @ 9:37pm

      Re:

      Let's imagine you have a needle in a haystack and you are tasked to find it because it's a murderous SOB. Let's imagine you have a technology called, say, haycial recognition, that can split that haystack into two piles, while letting you know that the murder-needle is in the far smaller pile.

      You're assuming no false negatives, and that's probably not a valid assumption. The false negative rate seems unknowable. If police assume the murder-needle is not in the big pile but it actually is, that's gotta be a bad thing.

      The false positives in this blatant non-analogy aren't harming anything except the police budget

      Sure, and there's no reason taxpayers should be concerned about that, right?

      False positives are pretty irrelevant.

      That depends highly on what the police do when the system indicates a match. I assume that at a minimum they are detaining people to determine their identity, and that alone rises to a level above "irrelevant".


      • Anonymous Coward, 10 May 2018 @ 9:37am

        Re: Re:

        That depends highly on what the police do when the system indicates a match.

        Being the UK, the first thing that they will likely do is have a close look at the indicated subject and the photos that they have of them. They will then look for an opportunity to approach and identify the person, maintaining formal politeness while doing so.


    • Re:, 9 May 2018 @ 11:57pm

      Re:

      Except of course, you don't. You have forgotten about the false negatives.

      You have no idea if the system is giving you a true negative or a false negative. All you have done here is check out all the positives to determine whether they are true or false.

      In your analogy, all that has been done is to scan the haystack, check out a couple of thousand pieces of hay, and assume that what is left of the haystack does not contain the needle.


    • Anonymous Coward, 10 May 2018 @ 8:04am

      Re:

      As others have mentioned... you don't actually know how many needles are in the haystack.

      It could be none, it could be 20,000.

      If you found 3 positives and 10 false positives out of 20,000 needles present, are you content that it's a job well done? Or was it a seriously flawed POS?

      We'll never know how well the software truly works, but relying on the facial recognition software to do the job with the assumption that it's better than not using the software - could be a major mistake.


    • Anonymous Coward, 10 May 2018 @ 9:04am

      Re:

      The false positives may not harm much in your analogy but in real life they most certainly do.

      But then if you want to shit on the rights of others then you are certainly within your rights to do so - right? Damn the torpedoes - full speed ahead


    • Anonymous Coward, 11 May 2018 @ 12:26am

      Re:

      The false positives in this blatant non-analogy aren't harming anything except the police budget

      This could be true, if the false positive is discovered before the arrest. What if it's discovered afterward?


  • Peter, 9 May 2018 @ 11:43pm

    How did they determine the true accuracy?

    Did they determine that each scan which didn't alert was actually a correct negative and not a false negative? Because just as important as ensuring they did not get a false positive on an innocent person was ensuring they did not get a 'pass' on a wanted person.

    In other words, did they check that every single scan of every person was returning the correct result in order to determine the true accuracy of the system? I am going to go out on a limb here and say, did they bollocks.


    • Anonymous Coward, 10 May 2018 @ 12:35am

      Re: How did they determine the true accuracy?

      As almost every adult in the country has been photographed by the authorities, such as when they apply for a passport, driver's license, or other government-issued ID, it should be technically possible to identify practically every single person in the stadium, and that's obviously not what's happening here -- but just give it time.


      • Anonymous Coward, 10 May 2018 @ 9:08am

        Re: Re: How did they determine the true accuracy?

        To make such a system work, it would need a periodic update because people and their faces do not remain unchanged over time and their "AI" software does a poor job of predicting these changes.

        So - how often? Possibly, in the future, your vehicle may be disabled until you update your facial recognition profile ... daily.


        • Anonymous Coward, 10 May 2018 @ 2:07pm

          Re: Re: Re: How did they determine the true accuracy?

          Fingerprints also change over time, and that's something that government agencies apparently don't know about, or care about.

          Fingerprints are not just used for finding 'bad' people, it's your badge of personal identity that the government has on file that's considered permanent (but really isn't, at least not in the eyes of a computer).

          Applying for US citizenship is one such example. If your current fingerprints don't precisely computer-match those on file that were taken by the INS many years or decades ago, then you are essentially a non-person in the eyes of the law and are not eligible to apply for citizenship. Proving that you are indeed still the same person could be an expensive, lengthy, and time-consuming uphill legal battle.

          So "what's the big deal?" you might ask. Even for people who never plan to ever vote in an election, citizenship becomes an extremely important issue in a person's old age, because US citizens can inherit a deceased spouse's property tax-free, while non-citizen permanent residents ("green card" holders) are hit with a whopping 40% federal tax on all assets upon the death of a spouse.

          But back to the subject of automatic facial recognition issues: it's almost a certainty that it will end up being abused in some ways, or the system trusted more than it ever should be, such as with innocent people having to suffer because "the computer says so."


  • That One Guy (profile), 10 May 2018 @ 12:07am

    'More justifications for searches? Where do we sign?!'

    (It should worry law enforcement as well, but strangely does not seem to bother them.)

    If the UK police are even remotely similar to the US police then there's nothing 'strange' about it. More false positives means more chances to search people and possibly find something incriminating they can use to boost the 'look at all the criminals we're finding!' numbers.

    That this would have massive negative impacts on privacy of everyone searched is a sacrifice they are valiantly willing to have the public pay.


  • Anonymous Coward, 10 May 2018 @ 12:24am

    Reid Technique going high tech

    That multi-colored facial recognition van reminds me of the UK's infamous television detector vans patrolling the streets that they tell the public can peer into houses and find unlicensed TV sets inside, prompting the residents caught this way to immediately fess up and pay the fine. Of course the claimed electronic wizardry was a lie, and the whole operation turned out to be just a form of Reid Technique applied to people who chose not to pay their TV tax.

    Likewise with AFR claims, we should be naturally skeptical in case there is some degree of smoke and mirrors going on and they're trying to fudge the test results or game the system in some way.

    Also, in a stadium full of people getting scanned, there would likely be hundreds, and possibly even thousands of people who are wanted by the police for some reason, mostly on minor offenses like failing to appear in court or pay a fine. Apparently most of them got missed and walked out of the stadium undetected.


  • Anonymous Coward, 10 May 2018 @ 1:46am

    I feel like their public interest test is missing a few factors...


  • Anonymous Coward, 10 May 2018 @ 5:33am

    Just wait until the New South Wales Police get this technology


  • Annonymouse, 10 May 2018 @ 6:20am

    How to find a needle in a haystack?
    All it takes is a match. 🔥


  • Richard (profile), 10 May 2018 @ 6:40am

    This post and the next one

    "significant advantage" that no "co-operation" is required from a person.

    Ah - but of course GDPR (see next post) is supposedly all about guaranteeing co-operation...

    Except of course the law enforcement is exempt....

    Quelle Surprise!


  • Anonymous Member of the Public, 10 May 2018 @ 7:35am

    Ridiculous Article

    Another piece of poor journalism from someone who doesn’t understand the technology and how it could help. I hope you don’t need to rely on the police to find a missing member of your family or prevent a terrorist attack using Facial Recognition.


  • I.T. Guy, 10 May 2018 @ 7:36am

    You kidding? They love the false positives. Perfect excuse to violate 92% of people's rights.


  • Ninja (profile), 10 May 2018 @ 8:13am

    The high false positive ratio would be less of a problem if we had proper law enforcement agents and procedures. By proper I mean: match detected, officers go to the person to confirm it visually, ask for IDs if needed, and let the false positive go with a smile and politeness. This could be true in a few more civilized countries, but we know that it may more often than not result in innocent deaths.

    Facial recognition isn't ready for prime time. Neither is law enforcement. Or rather, neither is humanity.


  • Anonymous Coward, 10 May 2018 @ 8:39am

    Doesn't matter. It makes out that the police have more to do than there are people to do it, and they can get more innocent people locked up, as per the examples set by the USA!


  • Anonymous Coward, 10 May 2018 @ 9:02am

    Surprising optimism by Techdirt author

    The worst case scenario for false positives triggered by law enforcement software is some time in jail and an arrest record.

    Based on other recent reporting, including on Techdirt, of officers using grossly excessive force with no penalty, I'd think this sentence should be:

    -- The worst case scenario for false positives triggered by law enforcement software is some unnecessary fatalities when the police shoot an innocent individual misidentified by the software as a match to an "armed and dangerous" fugitive cop-killer. If American cops were involved, figure some bystanders will get caught in the cross-fire too.


  • Anonymous Coward, 11 May 2018 @ 2:56am

    This addition will repurpose them into always-on surveillance devices

    At least they will have their body camera turned on for once, likely with some request logs server-side (because there is no way the matching will happen on the devices themselves) to correlate with suspiciously missing footage.


