London Metropolitan Police Deploy Facial Recognition Tech Sporting A 100% Failure Rate

from the TOP.-TECH. dept

Facial recognition tech isn’t working quite as well as the agencies deploying it have hoped, but failure after failure hasn’t stopped them from rolling out the tech just the same. I guess the only way to improve this “product” is to keep testing it on live subjects in the hope that someday it will actually deliver on advertised accuracy.

The DHS is shoving it into airports — putting both international and domestic travelers at risk of being deemed terrorists by tech that just isn’t quite there yet. In the UK — the Land of Cameras — facial recognition tech is simply seen as the logical next step in the nation’s sprawling web o’ surveillance. And Amazon is hoping US law enforcement wants to make facial rec tech as big a market for it as cloud services and online sales.

Thanks to its pervasiveness across the pond, the UK is where we’re getting most of our data on the tech’s successes. Well… we haven’t seen many successes. But we are getting the data. And the data indicates a growing threat — not to the UK public from terrorists or criminals, but to the UK public from its own government.

London cops have been slammed for using unmarked vans to test controversial and inaccurate automated facial recognition technology on Christmas shoppers.

The Metropolitan Police are deploying the tech today and tomorrow in three of the UK capital’s tourist hotspots: Soho, Piccadilly Circus, and Leicester Square.

The tech is basically a police force on steroids — capable of demanding ID from thousands of people per minute. Big Brother Watch says the Metro tech can scan 300 faces per second, running them against hot lists of criminal suspects. The difference is no one’s approaching citizens to demand they identify themselves. The software does all the legwork and citizens have only one way to opt out: stay home.
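For anyone wondering how that works mechanically, the usual setup is straightforward (a minimal sketch with invented names and numbers; the Met hasn't disclosed its actual pipeline): every face the camera catches is boiled down to a numeric "embedding" and compared against each hot list entry, and anything over a similarity threshold raises a flag.

```python
# Minimal sketch of watchlist matching as these systems are commonly
# described. All names, sizes, and thresholds here are illustrative;
# this is not the Met's actual pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 128-dimensional embeddings for a 500-person hot list.
hotlist = rng.normal(size=(500, 128))
hotlist /= np.linalg.norm(hotlist, axis=1, keepdims=True)

def match(face_embedding: np.ndarray, threshold: float = 0.6):
    """Return the hot-list index of the best match above threshold, else None."""
    face_embedding = face_embedding / np.linalg.norm(face_embedding)
    scores = hotlist @ face_embedding   # cosine similarity against every suspect
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Every passer-by gets checked automatically; there is no opt-in step.
passerby = rng.normal(size=128)
print(match(passerby))  # usually None; the threshold trades recall for precision
```

The threshold is the whole ballgame: set it low and you flood officers with false matches; set it high and the hot list might as well not exist.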

Given these results, staying home might just be the best bet.

In May, a Freedom of Information request from Big Brother Watch showed the Met’s facial recog had a 98 per cent false positive rate.

The group has now said that a subsequent request found that 100 per cent of the so-called matches since May have been incorrect.

A recent report from Cardiff University questioned the technology’s abilities in low light and crowds – which doesn’t bode well for a trial in some of the busiest streets in London just days before the winter solstice.

The tech isn’t cheap, but even if it were, it still wouldn’t provide any return on investment. To be fair, the software isn’t misidentifying people hundreds of times a second. In the great majority of scans, nothing is returned at all. The public records response shows the Metro Police racked up five false positives during their June 28th deployment. This led to one stop of a misidentified individual.

But even if the number of failures is small compared to the number of faces scanned, the problem is far from minimal. A number of unknowns make this tech a questionable solution for its stated purpose. We have no idea how many hot list criminals were scanned and not matched. We don’t know how many scans the police performed in total. We don’t know how many of these scans are retained and what the government does with all this biometric data it’s collecting. About all we can tell is the deployment led to zero arrests and one stop instigated by a false positive. That may be OK for a test run (it isn’t) but it doesn’t bode well for the full-scale deployment the Met Police have planned.
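To put rough numbers on that: a system can stay silent on the overwhelming majority of scans and still be wrong every single time it speaks up. Here's a back-of-the-envelope sketch (the total scan count below is invented, since the FOI responses don't disclose one):

```python
# Figures from the public records response, plus one made-up number.
true_positives = 0     # confirmed hot-list matches: none
false_positives = 5    # innocent people flagged during the June 28th deployment

alerts = true_positives + false_positives
false_discovery_rate = false_positives / alerts if alerts else 0.0
print(f"{false_discovery_rate:.0%} of alerts were wrong")      # 100%

# A per-scan error rate that sounds negligible still produces those alerts:
scans = 100_000   # hypothetical; the Met doesn't publish this figure
print(f"errors per scan: {false_positives / scans:.3%}")       # 0.005%
```

That's the gap between the vendor's framing ("it's almost never wrong per scan") and the reality on the street (every alert an officer acted on was wrong).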

The public doesn’t get to opt out of this pervasive scanning. Worse, it doesn’t even get to opt in. There’s no public discussion period for cop tech even though, in the case of mass scanning systems, the public is by far the largest stakeholder. Instead, the public is left to fend for itself as law enforcement agencies deploy additional surveillance methods — not against targeted suspects, but against the populace as a whole. This makes the number of failures unacceptable, even if the number is a very small percentage of the whole.



Comments on “London Metropolitan Police Deploy Facial Recognition Tech Sporting A 100% Failure Rate”

Bamboo Harvester (profile) says:

While I...

…abhor street cameras, an obvious 4th violation, this line:

“We have no idea how many hot list criminals were scanned and not matched.”

is a problem. Headlines of 100% *failure* rate, but if there were NO “hot list criminals” in the “sample group”, the software is 100% *effective*.

Putting ten or twenty test subjects in the crowd deliberately would give a more accurate reading. But then it would be a test rather than a deployment. Can’t run tests after the sucker … er… “client” has already bought the package, after all…

Anonymous Coward says:

Re: While I...

Headlines of 100% failure rate, but if there were NO "hot list criminals" in the "sample group", the software is 100% effective

The 100% failure rate was the rate at which the software incorrectly identified people as a criminal, not the failure to identify any actual criminals (though that also happened). Thus, we are at 0% effective.

James Burkhardt (profile) says:

Re: While I...

The 100% failure rate in the headline was, in the article, shown to reference the 100% false positive rate.

In May, a Freedom of Information request from Big Brother Watch showed the Met’s facial recog had a 98 per cent false positive rate.

The group has now said that a subsequent request found that 100 per cent of the so-called matches since May have been incorrect.

The line you quote references false negatives.

If there were no criminals present but the software flagged multiple people as criminals, it most certainly had a 100% failure rate – every flag was a failure.

There is, in fact, no way to find anything in this data but 100% failure. Best case, no criminals were present and all that was available to flag were innocents: 100% failure by finding criminals where none existed. Certainly not 100% effective.

The more criminals actually present and in the database, the worse it looks, because then it identified innocents as criminals, and failed to find the actual criminals.

PaulT (profile) says:

“street cameras, an obvious 4th violation”

Erm, I’d love to hear your logic on this one. Isn’t the fourth the “right of the people to be secure in their persons, houses, papers, and effects”? How does that apply to recording a public street?

“if there were NO “hot list criminals” in the “sample group”, the software is 100% *effective*.”

This, however, is definitely true. But, there’s likely to have been some testing with other subjects before these public ones, and they probably can’t legally target people without their consent (though said consent might lead to accusations of pre-programmed bias if the software is successful).

Bamboo Harvester (profile) says:

Re: Re:

Because it’s a rarity for a street camera to not catch some private property in its view as well.

I own several rental buildings. I’ve got cameras on all of them, but was very careful to make sure none of them are catching views of the neighboring properties.

If a stoplight-mounted camera has apartment or house windows in its field of view, how can it not be a 4th violation? It’s effectively surveilling that window.

PaulT (profile) says:

Re: Re: Re:

I don’t know, that seems like a stretch to me. By that logic, you can commit any crime you want in front of a window that faces the street and it’s a 4th amendment violation if an officer sees you and decides to act on it. Hell, just set up your crack den across the street from the police station, don’t install curtains and do whatever you want!

I understand privacy concerns and would certainly hope that those setting the cameras up were as mindful as you. But, I think it’s really stretching the point to say that any camera surveillance in the public street is a violation.

Bamboo Harvester (profile) says:

Re: Re: Re: Re:

Traffic patterns if a private driveway is in their view. Archiving the comings and goings at the doors.

I view them like the automated license plate readers.

Prior to the readers, an officer had to have a REASON to call in a plate. They didn’t do it gratuitously, as it would annoy the hell out of the clerk running the search.

And they got ONLY the registered owner and information on the vehicle itself.

With the ALPR, they get not just that info, but it’s linked in dozens of databases, giving all kinds of information “at the push of a button” that the officer does not need to know for any reason.

And the ALPR treats everyone as guilty. It scans EVERY plate that goes past it.

The same for the street cameras, especially where they infringe on private property. Would you put up with your neighbor setting up a camera to watch your teenaged daughters in your pool? Of course not. So why are the cops “special” in this matter?

I give all my tenants access to the camera footage, with real-time available for the one covering the front door.

Ever try to get footage from a traffic camera?

Scary Devil Monastery (profile) says:

Re: Re: Re:3 Re:

"Of course you wouldn’t want someone filming your daughters. Too bad there’s nothing you could legally do to make them stop. "

What.

Seriously, look at…most examples of western law on this. If you manage to film someone in such a way that it intrudes upon or violates the personal integrity of others, then that is, by definition, breaking the law.

Hence why placing a camera is usually circumscribed by legal concerns – you WILL be held responsible for what it captures.

And as someone pointed out in this thread already, police appear to be getting away with this since there is literally no oversight.

PaulT (profile) says:

Re: Re: Re:4 Re:

“And as someone pointed out in this thread already, police appear to be getting away with this since there is literally no oversight.”

Are they “getting away with it”? Or, is it just that there’s different standards applied to someone deliberately targeting your property than applied to someone happening to catch a small part of your property when looking at the public street? If the scenario he fears is even happening at all.

PaulT (profile) says:

Re: Re: Re:2 Re:

“Traffic patterns if a private driveway is in their view. Archiving the comings and goings at the doors.”

So, the same as anyone in the street can see, or sat in an opposing property. Do you consider a stakeout to be a violation as well, if they can see your house as well as the intended target?

“And the ALPR treats everyone as guilty. It scans EVERY plate that goes past it.”

Every plate that’s out being driven on public roads, sure. You appear to be having a problem with the automation, not the camera, which is a different issue.

“So why are the cops “special” in this matter?”

Because they are in theory putting them up in order to monitor and protect the public streets, which is a big part of their job.

You may disagree with how they operate in reality, and how necessary the tools actually are. But, it’s not exactly hard to see why a camera that may see the corner of your property that’s visible from the public street while monitoring that street is different from one set up specifically to view your property.

Bamboo Harvester (profile) says:

Re: Re: Re:3 Re:

If the police want to watch and film a private property, they’re required to get a Warrant.

If a traffic cam is watching a private property, the police have access to the footage captured without a Warrant.

THAT is why it’s a 4th violation (in the US).

It’s also not unknown for traffic cameras to get “accidentally” moved so they’re watching a window or driveway instead of the intersection.

Blake A Senftner says:

Re: Re: Re:

Think of facial recognition as a camera that is blind to anything but faces; plus, the faces need to be generally facing the camera and at least some minimum size, such as 160 pixels tall. A traffic camera lens is such that outside of the traffic intersection, human heads are too small for FR to identify, so it is effectively blind past the intersection.
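A rough illustration of that size cutoff, using OpenCV's stock Haar-cascade face detector as a stand-in (real FR pipelines and their thresholds vary; the 160-pixel figure is just the example above):

```python
# Detections below the minimum size are never returned, so distant
# pedestrians are invisible to the recognition stage. Illustrative only.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def usable_faces(frame, min_face_px: int = 160):
    """Return only face boxes big enough for recognition to have a chance."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(
        gray,
        scaleFactor=1.1,
        minNeighbors=5,
        minSize=(min_face_px, min_face_px),  # skip anything smaller
    )
```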

Cdaragorn (profile) says:

Re: Re: Re:

It’s a long-established fact that watching private property from somewhere you’re legally allowed to be is never a violation of the 4th and is within your legal right to do.

Intruding upon someone’s privacy is not the same thing as someone opening their private areas up for full display to the public. The only possible issue for the government is that it’s not allowed to keep those images for very long, except for any it has probable cause to connect to some actual crime.

Anonymous Coward says:

Re: Re: Re: Re:

One could test this theory in a real life situation.

For example, one could park their vehicle on the public road in a spot where parking is allowed and point several cameras at one particular private property. Use of binoculars will help in the test scenario.

Then simply wait for the police to show up … or worse – the vigilantes.

Bamboo Harvester (profile) says:

Re: Re:

100% effective…

I’d assume the manufacturer did some sampling using its own employees.

But for the cops not to do the same makes it very difficult to claim the software match was sufficient cause for a stop. The days of people believing computers don’t make mistakes are long gone.

Now, if along with their “hot list” of criminals, they scanned EVERY cop into the system, AND the system matched them frequently (and correctly), they’d have grounds to have such stops accepted as evidence (in the US, not up on Brit law).

And how could any cop decline to be scanned in? After all, if you’ve got nothing to hide, you’ve got nothing to worry about, right?

Bamboo Harvester (profile) says:

Re: Re: Re: Re:

Disagree. ALL cops are fingerprinted. It has almost no effect on OTHER prints found at a crime scene.

I’m not saying to add the cops to a test pool – I’m saying ALL cops should be REQUIRED to be in the database, and matches to them flagged.

Think what the bodycams were supposed to do.

If they’re doing facial recognition on crowds of innocents, how can they justify NOT having the cops in that database?

Cops are a necessary evil. Early in the dim mists of time, society as a whole realized it was marginally better to have them on the inside urinating out than outside urinating in.

But they need watching, and what better tools to watch them than the ones they deploy against every non-cop?

PaulT (profile) says:

Re: Re: Re:2 Re:

You seem to be completely missing the point. This is not about whether or not cops are being monitored. The point is that this is a public beta test. You can’t have an effective beta test that will hold up to scrutiny if the only test subjects are employees of the people running the tests.

“I’m saying ALL cops should be REQUIRED to be in the database, and matches to them flagged.”

They may well be – AFTER the tests are concluded. If you’re calling for them to be in the tests themselves, you’re asking for potentially biased tests – the results of which will be used to justify full rollout of this technology, both in the UK and US. I’m sure that’s not what you mean to be asking for, but you are.

James Burkhardt (profile) says:

Re: Re: Re:

I disagree. As a rule, computers don’t make mistakes. Except on rare occasions, they do what they are told to do.

The issues are A) people don’t understand what they are telling the computer to do, B) we are bad at telling a computer how to do a bunch of things that we do at an instinctual level, C) software engineers and police don’t think the same way, and D) the computers at issue are dealing with maybes but are being programmed with yes-or-no responses.

I see it all the time in my office. People THINK the computer is just doing “whatever it wants”. But they don’t understand what the computer is doing or how it is doing it, so when it acts differently than they expect, it’s ‘going crazy’.

TripMN says:

Re: Re: Re: Re:

James, aren’t you a lawyer by trade?

As a software engineer, I commend you on your understanding of computers that many outside of the engineering fields don’t get. Computers do what they are told, very quickly and very efficiently. We usually run into trouble with “software” and “data” because the humans involved don’t fully understand what they’ve told the computer to do or don’t understand how their code will act/react on data coming in from the real world.

“Works in the lab/on my computer” is often taken to mean that things will work the same in the real world … and they never do.

nasch (profile) says:

Re: Re: Re: Re:

As a rule, computers don’t make mistakes.

That depends on how you define making a mistake. And it gets much fuzzier when you throw in machine learning (I’m not sure if this system uses that). Even without it, if the system is supposed to identify people in a database, and it has false positives, I’d say the system (meaning the software and hardware) made a mistake. Did the machine correctly follow the instruction sets that were fed to the processor? Yes, but we can look at computer systems at a higher level than that, and analyze whether they are fulfilling their function correctly. In this case, this system was not.

Anon says:

Several Points

The issue with video is as much archiving as visibility – it’s one thing to surveil someone or their property because they are an active subject in an ongoing investigation; another to compile dossiers, or archive collections, or whatever you want to call it, on citizens not involved in criminal activities. This is not the KGB – we don’t build collections of data on anyone who happens to cross paths with police collection processes.

So the police can accidentally video your living room window – but they should not keep that data (or any data) forever, or even for months.

The same applies to license plate data – perhaps the police compile lists of license plates. But they should not have a collection going back years, showing every movement your car has made over that time – i.e. your complete movements for the past year. Maybe with a warrant they can collect ongoing data from these devices for specific individuals; the rest should be deleted in a reasonable time (a week? Two weeks?)
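For what it’s worth, a retention rule like that is only a few lines of code; the two-week window and record fields below are hypothetical policy choices, not anything on the books:

```python
# Hypothetical purge of plate reads not tied to a warrant after a fixed window.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=14)  # "two weeks?" is a policy choice, not settled law

def purge(reads: list[dict]) -> list[dict]:
    """Keep warrant-backed reads; drop everything else older than the window."""
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in reads if r["warrant"] or r["seen_at"] >= cutoff]
```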

Building a similar inaccurate database of “facially identified” people with a flawed program is just ripe for abuse. “Evidence” will incriminate perfectly innocent people. “Your face was identified walking toward the crime scene. We have video of you in your living room 3 weeks before where you wear the same shirt the perp did. Our license plate reader saw you drive by 4 blocks from the crime scene an hour before. Please come with us.”

Of course, trying to test facial recognition in a location that contains probably one of the largest collections of different faces – major international tourist destinations – is sure to catch the largest possible incidence of doppelgangers. More interesting would be to see how many of these false-positive faces were of other ethnic extractions. There are already articles suggesting the tech fails excessively for Chinese and black faces.

Thad (profile) says:

Re: Re: Re: Re:

Person of Interest was pretty good. I mean, the first season was pretty much just a standard cop show with a magic computer, but after that it turned into a much deeper and more interesting SF show about the ramifications of surveillance and AI.

It’s really kind of a fascinating middle-step between The Dark Knight and Westworld (Jonathan Nolan wrote or co-wrote all three).

Anonymous Coward says:

Re: Re: Re: Re:

Sometimes reality is decades behind. I’m still waiting for software that will take a two-dimensional photograph and map it into a 360-degree 3D model by combining all the distorted reflections off glass and other shiny surfaces to reconstruct everything that’s out of view of the camera and basically “see around corners”. Maybe one day such imaging tools won’t just be science fiction anymore.

That One Guy (profile) says:

'You first'

Ignoring for a moment privacy implications, want to make sure that the tech is tested and accurate before it’s aimed at the public? Aim the cameras at the entrance to police stations and government buildings first, with the public given the same access to that data as the police get to the data from cameras aimed at the public.

I suspect that accuracy (or lack thereof) would suddenly become a very important selling point, practically overnight.

Beta (profile) says:

Dogberry tech

"The tech is basically a police force on steroids — capable of demanding ID from thousands of people per minute… The difference is no one’s approaching citizens to demand they identify themselves."

To be fair, there is another difference: the tech does not detain those it cannot identify — at least, not yet. Shakespeare himself made fun of watchmen who behave like that. ("Why, then, take no note of him, but let him go; and presently call the rest of the watch together and thank God you are rid of a knave.")
