London Metropolitan Police Deploy Facial Recognition Tech Sporting A 100% Failure Rate

from the TOP.-TECH. dept

Facial recognition tech isn't working quite as well as the agencies deploying it have hoped, but failure after failure hasn't stopped them from rolling out the tech just the same. I guess the only way to improve this "product" is to keep testing it on live subjects in the hope that someday it will actually deliver on advertised accuracy.

The DHS is shoving it into airports -- putting both international and domestic travelers at risk of being deemed terrorists by tech that just isn't quite there yet. In the UK -- the Land of Cameras -- facial recognition tech is simply seen as the logical next step in the nation's sprawling web o' surveillance. And Amazon is hoping US law enforcement wants to make facial rec tech as big a market for it as cloud services and online sales.

Thanks to its pervasiveness across the pond, the UK is where we're getting most of our data on the tech's successes. Well... we haven't seen many successes. But we are getting the data. And the data indicates a growing threat -- not to the UK public from terrorists or criminals, but to the UK public from its own government.

London cops have been slammed for using unmarked vans to test controversial and inaccurate automated facial recognition technology on Christmas shoppers.

The Metropolitan Police are deploying the tech today and tomorrow in three of the UK capital's tourist hotspots: Soho, Piccadilly Circus, and Leicester Square.

The tech is basically a police force on steroids -- capable of demanding ID from thousands of people per minute. Big Brother Watch says the Metro tech can scan 300 faces per second, running them against hot lists of criminal suspects. The difference is no one's approaching citizens to demand they identify themselves. The software does all the legwork and citizens have only one way to opt out: stay home.

Given these results, staying home might just be the best bet.

In May, a Freedom of Information request from Big Brother Watch showed the Met's facial recog had a 98 per cent false positive rate.

The group has now said that a subsequent request found that 100 per cent of the so-called matches since May have been incorrect.

A recent report from Cardiff University questioned the technology's abilities in low light and crowds – which doesn't bode well for a trial in some of the busiest streets in London just days before the winter solstice.

The tech isn't cheap, but even if it were, it still wouldn't be providing any return on investment. To be fair, the software isn't misidentifying people hundreds of times a second. In the great majority of scans, nothing is returned at all. The public records response shows the Metro Police racked up five false positives during their June 28th deployment. This led to one stop of a misidentified individual.

But even if the number of failures is small compared to the number of faces scanned, the problem is far from minimal. A number of unknowns make this tech a questionable solution for its stated purpose. We have no idea how many hot list criminals were scanned and not matched. We don't know how many scans the police performed in total. We don't know how many of these scans are retained and what the government does with all this biometric data it's collecting. About all we can tell is the deployment led to zero arrests and one stop instigated by a false positive. That may be OK for a test run (it isn't) but it doesn't bode well for the full-scale deployment the Met Police have planned.
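The base-rate arithmetic explains why nearly every alert from this kind of mass scanning turns out to be wrong, even if the underlying classifier is decent. Actual suspects are vanishingly rare in a crowd, so false positives swamp true hits. The figures below are hypothetical assumptions for illustration (the Met hasn't released scan counts, hot-list prevalence, or per-face error rates), not reported numbers:

```python
# Hypothetical base-rate sketch: why nearly all "matches" from mass
# face scanning are false positives. All numbers below are assumed,
# not figures released by the Met Police.

def match_breakdown(scans, prevalence, sensitivity, false_positive_rate):
    """Return expected (true_matches, false_matches) from mass scanning."""
    on_watchlist = scans * prevalence
    not_on_watchlist = scans - on_watchlist
    true_matches = on_watchlist * sensitivity               # real hits found
    false_matches = not_on_watchlist * false_positive_rate  # innocents flagged
    return true_matches, false_matches

# Say 100,000 faces scanned, 1 in 10,000 actually on a hot list, and a
# generous system: 90% sensitivity, 0.1% per-face false positive rate.
true_m, false_m = match_breakdown(100_000, 1 / 10_000, 0.90, 0.001)

precision = true_m / (true_m + false_m)
print(f"expected true matches:  {true_m:.0f}")   # -> 9
print(f"expected false matches: {false_m:.0f}")  # -> 100
print(f"share of alerts that are wrong: {1 - precision:.0%}")  # -> 92%
```

Even with per-face error rates far better than anything in the Met's trials, most alerts are false; the 98-100 percent failure rates Big Brother Watch obtained imply the real numbers are worse still.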

The public doesn't get to opt out of this pervasive scanning. Worse, it doesn't even get to opt in. There's no public discussion period for cop tech even though, in the case of mass scanning systems, the public is by far the largest stakeholder. Instead, the public is left to fend for itself as law enforcement agencies deploy additional surveillance methods -- not against targeted suspects, but against the populace as a whole. This makes the number of failures unacceptable, even if the number is a very small percentage of the whole.

Filed Under: facial recognition, london, metropolitan police


Reader Comments




    Anon, 20 Dec 2018 @ 8:35am

    Several Points

    The issue with video is as much archiving as visibility - it's one thing to surveil someone or their property because they are an active subject in an ongoing investigation; it's another to compile dossiers, or archive collections, or whatever you want to call it, on citizens not involved in criminal activity. This is not the KGB - we don't build collections of data on anyone who happens to cross paths with police collection processes.

    So the police can accidentally video your living room window - but they should not keep that data (or any data) forever, or even for months.

    The same applies to license plate data - perhaps the police compile lists of license plates. But they should not have a collection going back years showing every movement your car has made over that time - i.e. your complete movements for the past year. Maybe with a warrant they can collect ongoing data from these devices for specific individuals; the rest should be deleted within a reasonable time (a week? Two weeks?).

    Building a similar inaccurate database of "facially identified" people with a flawed program is ripe for abuse. "Evidence" will incriminate perfectly innocent people. "Your face was identified walking toward the crime scene. We have video of you in your living room 3 weeks before, wearing the same shirt the perp did. Our license plate reader saw you drive by 4 blocks from the crime scene an hour before. Please come with us."

    Of course, trying to test facial recognition at a location that contains probably one of the largest collections of different faces - major international tourist destinations - is sure to catch the largest possible incidence of doppelgangers. More interesting would be to see how many of these false-positive faces were of other ethnic extractions. There are already articles suggesting the tech fails excessively for Chinese and black faces.
