Yet Another Bad Idea: Dropping Facial Recognition Software Into Police Body Cameras

from the Citizen-Rolodex dept

The FBI (and other US government agencies) are already moving forward with facial recognition technology, which will allow law enforcement to scan people like license plates, if everything goes to plan. So far, consultation with the meddling public has been kept to a minimum, as have any government efforts to address civil liberties concerns.

Just because the public’s been kept out of the loop (except for, you know, their faces and other personal information), doesn’t mean members of the private sector aren’t working hard to ensure police officers can start running faces like plates, even when there’s no legitimate law enforcement reason for doing so.

Digital Barriers, a somewhat ironically named tech company, is pushing its latest law enforcement offering — one that supposedly provides real-time face scanning.

The software can pick out and identify hundreds of individual faces at a time, instantly checking them against registered databases or registering unique individuals in seconds.

Demonstrating the software at the Forensics Europe Expo 2017, vice president of Digital Barriers Manuel Magalhaes said the company was introducing the technology to UK forces.

He said: “For the first time they (law enforcement) can use any surveillance asset including a body worn camera or a smartphone and for the first time they can do real time facial recognition without having the need to control the subject or the environment.

“In real time you can spot check persons of interests on their own or in a crowd.”

But why would you? Just because it can be done doesn’t mean it should be done. This will basically allow officers to run records checks on everyone who passes in front of their body-worn cameras. There is nothing in the law that allows officers to run checks on everyone they pass. They can’t even stop and/or frisk every member of the public just because they’re out in public. Expectations of privacy are lowered on public streets, but that doesn’t make it reasonable to subject every passerby to a records check. And that’s without even factoring in the false positive problem. Our own FBI seems to feel a 15% bogus return rate is perfectly acceptable.

Like so much surveillance equipment sold to law enforcement agencies, Digital Barriers’ offering was developed and tested in one of our many war zones. The head of the company is inordinately proud of the product’s pedigree, which leads to a statement that could be taken as bigoted if it weren’t merely nonsensical.

Mr Magalhaes continued: “If we can overcome facial recognition issues in the Middle East, we can solve any facial recognition problem here in the United Kingdom.”

Hopefully, this just refers to the sort of issues normally found in areas of conflict (hit-and-miss communications infrastructure, harsher-than-usual working conditions, etc.), rather than hinting Middle Eastern facial features are all kind of same-y.

Taking the surveillance out of the Middle East isn’t going to solve at least one logistical problem keeping this from becoming a day-to-day reality for already heavily-surveilled UK citizens. As is pointed out by officers in the discussion thread, Digital Barriers’ real-time face scanning is going to need far more bandwidth than is readily available to law enforcement. One commenter notes they can’t even get a strong enough signal to log into their email out in the field, much less perform the on-the-fly facial recognition Digital Barriers is promising.

The other pressing issue — according to the law enforcement members discussing the post — is one far more aligned with the general public’s concerns. A couple of members point out no one PNC’s entire crowds (referring to the UK’s law enforcement database: the Police National Computer) and that doing so might not even be legal.

Unfortunately, the rank-and-file rarely get to make these decisions. These choices will be made by people who think the public needs to give until it hurts when safety and security are on the line. Dropping this capability into body cameras will make them more of an intrusion on the lives of citizens and far less likely to result in police accountability. Faces being linked automatically to databases full of personal info creates complications in obtaining camera footage. It won’t result in improved policing, even though there are plenty of supporters who mistakenly believe “easier” is synonymous with “better.”



Comments on “Yet Another Bad Idea: Dropping Facial Recognition Software Into Police Body Cameras”

Anonymous Anonymous Coward (profile) says:

Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

Just how many steps away are we from having government-required serial numbers tattooed on our foreheads at birth, worldwide?

At some point there WILL be revolt, and it may spread worldwide. Then what do governments do? Are they really as short-sighted as the auto-trading algorithms that Wall Street uses?

I think yes.

Anonymous Coward says:

Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

I think governments are pretty happy with the current situation. Most people carry around a tracking device and use cashless forms of payment. A revolt is unlikely while everyone continues to fund their own surveillance.

That Anonymous Coward (profile) says:

Re: Papers Please. No, forget the please, just give me your damn documents that verify your right to exist.

I am really annoyed with people crediting me with the whole number assigned to people thing.
Just because they say I’m going to do it before the end times, doesn’t mean I am.

They’ve screwed everything up way more than I ever could have.

Anonymous Coward says:

Perhaps this isn't such a bad idea...

What if there was an app for smart phones linked to a facial recognition database of police officers accused of excessive violence, all-around being a dick, harassment, etc. that immediately puts the camera into record mode, while calling a lawyer?

I’d say we could scale that back to convicted police officers, but since they’re rarely if ever held accountable for anything, that database would be mostly useless.

aerinai says:

If a person can do it, a computer should be able to do it.

So I’m definitely going to swim against the current on this one a little bit… This technology isn’t inherently bad, just how it is used (like license plate scanners).

If the police were told ‘Bob Jones is a bail jumper wanted for murder, here’s his picture, be on the lookout’ and then a cop pulls over someone that meets this description, I’d say that the officer is using his best judgement. I don’t think anyone would fault the officer for doing this. I would also bet money that the officer is wrong far more than 15% of the time. I can see this technology being great at catching criminals, assuming they don’t take it too far.

I alluded to license plate scanners… if you just scan and dump when there is no reason to keep the information (or even a transient 72 hour hold), I don’t see a big deal with this. A cop could do the exact same thing with his brain. You are just automating it… But the problem comes in when you begin to perpetually store this stuff and start using that data to cross check and query… THAT is where it crosses a line.

I would also say giving personal information to the officer about a person that he doesn’t have a reason to know is also a step too far… hooking in facial recognition to Facebook or a database of non-violent felons. Assuming they are just doing this stuff behind the scenes in a computer somewhere (and the data is dumped after a period), I’m ok with this.

However, this isn’t the software or technology’s fault, it is how it is implemented and used.

And yes… I know that they will totally be abusing this… but assuming they put in proper safe guards (which they probably won’t…) I would be fine with this.

Anonymous Coward says:

Re: If a person can do it, a computer should be able to do it.

If the technology exists it will be abused. For now people rail against the technology. Then, when it is inevitably abused, people will rail against the abuse instead.

All of the railing is hollow, however. Until people get sufficiently pissed off to do something about it nothing will be done. And by the time people get sufficiently pissed off it may well be too late.

sigalrm (profile) says:

Re: How accurate?

"I’m curious how accurate this particular implementation of the technology is…"

The article above states a false positive rate of 15%. To put that in perspective, CenturyLink Field in Seattle has a listed max capacity of 67,000 people, which means that over the course of a Seahawks game, as many as 10,050 people would be misidentified.

"Seems like it would be just a matter of time until the incorrect person is identified, runs because they’re scared and gets killed."

Not a problem – qualified immunity covers that scenario. Besides, "only guilty people run". /s

sigalrm (profile) says:

Re: Re: How accurate?

I just realized, the figure above is for American football, and Digital Barriers is a UK Company so let’s try this:

According to Wikipedia, Emirates Stadium, home of Arsenal FC in London, England, has a capacity of “over 60,000”.

Assuming a 15% false positive rate, grade-school math says ~9000 people per Arsenal game would be misidentified.

Anonymous Coward says:

…which leads to a statement that could be taken as bigoted if it weren’t merely nonsensical.

Hopefully, this just refers to the sort of issues normally found in areas of conflict (hit-and-miss communications infrastructure, harsher-than-usual working conditions, etc.),

So which is it? Is it nonsensical, or is it potentially referring to very real problems with deploying new technology in real-life situations?

I’m willing to give Tim the benefit of the doubt on leveling thinly supported racism accusations against his "opponent," but claiming the statement is nonsensical beforehand, and then providing several perfectly reasonable and easily deducible explanations immediately afterward is pushing the bounds of credulity.

orbitalinsertion (profile) says:

Re: Re:

It’s a standard problem with all manner of things which are designed or studied against the dominant culture (or WEIRD groups), which has inherent bigoted assumptions given they always want to deploy or generalize to the whole world. In fact, usually not against the group they took their assumed metrics from whatsoever. Even if the guy isn’t making some kind of allusion to “they all look alike”, he is still implying it because why bother mentioning “in the Middle East”?

Yes, there is a problem with deploying the technology, and technologically it is irrelevant against whom you are deploying it.

It could simply mean that usage in Middle East was pretty much their only market prior, but that there is pretty much predicated on bigotry in the first place, even if Magalhaes didn’t just happen to process a bit of endemic cultural racist script or outright intentionally other “Middle Easterners”.

Was it worth noting? I probably would have noticed for a moment myself. But there are a lot of baked-in cultural bigotries, sometimes subtle, and the responses to potential moments of bigotry (conscious or unconscious) being pointed out are generally a bit telling.

Anonymous Coward says:

It is already going to be applied

Don’t we already know that a prominent company is offering to supply cops around the country with free bodycams, in return for exclusive rights to the video from those cams? Do you honestly think they aren’t planning on using any and every tool at their disposal to data-mine that video for as much as they possibly can? Soon we will see CCTV cams going up everywhere with similar terms, turning public spaces into privacy-free zones.

Roger Strong (profile) says:

There are people who keep being given the pavement taste test by police because a criminal has stolen their identity. Every time their name is removed from police records, it gets added back in when police share data with another jurisdiction.

Now it’ll only take roughly the same bone structure in your face. And with daily crowd-level scanning by large numbers of police, there’s going to be A LOT of false positives.

Digital Barriers – Soon to be the answer to the trivia question "How are so many Britons able to identify an area based on the taste of the pavement?"

Faceguy says:

Facial Recognition

Don’t fool yourself. There is software with a far better false positive rate than 15%. It would have been available back in 2008, but the big crash stopped any financial backing to build the company. A year later, it went belly up. I know it worked.

Not that I am concerned about it today. It wasn’t hard to see where it was going, and what it could possibly be used for. Where it went after that is anyone’s guess. I simply moved on, but I do know it existed. I worked on it!

I didn’t find 15% acceptable. We were in the 3-5% false positive range, and I didn’t find even that acceptable. In those cases, it simply said no match. It could distinguish between identical twins! That always amazed me. At that time, given a large enough database, that certainly beat the current software. By now it could have been much better!

Roger Strong (profile) says:

Re: Facial Recognition

Suppose they get the false rate down to 2%.

We’re talking here about cameras on cops in crowds. Say, 30 cops walking around downtown London or at a major sporting event or concert, with each camera scanning 2,000 faces an hour.

That’s 1200 false positives an hour. Compared to, what, one actual wanted person a week? That’s still not practical.

It reminds me of a review of some speech-to-text software a few years back: Voice artists might get better results, but the average schmuck was only going to get a “mere” 98% accuracy. The hassle of correcting the software every 50 words was still enough to render it not worth using.
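The base-rate arithmetic above can be sketched in a few lines. This is a rough illustration using the comment's own figures (30 cameras, 2,000 faces an hour each, a 2% false positive rate); the 40 scanning hours per week and one genuinely wanted person per week are hypothetical assumptions added for the precision calculation:

```python
# Back-of-the-envelope base-rate check for crowd-scale face scanning.
# Figures from the comment above: 30 body cameras, each scanning
# 2,000 faces per hour, with a 2% false positive rate.
cameras = 30
faces_per_camera_per_hour = 2_000
false_positive_rate = 0.02

scans_per_hour = cameras * faces_per_camera_per_hour  # 60,000 scans/hour
false_positives_per_hour = scans_per_hour * false_positive_rate
print(f"{false_positives_per_hour:.0f} false positives per hour")  # 1200

# Even a low error rate swamps a rare true hit. Assumed figures:
# 40 scanning hours a week, one genuinely wanted person passing by per week.
hours_per_week = 40
true_hits_per_week = 1
false_alerts_per_week = false_positives_per_hour * hours_per_week  # 48,000
precision = true_hits_per_week / (true_hits_per_week + false_alerts_per_week)
print(f"roughly 1 correct alert in {1 / precision:,.0f}")
```

The point of the sketch is that precision depends on how rare the target is, not just on the error rate: with those assumptions, only about one alert in 48,000 would be a real match.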

Anonymous Coward says:

Re: Re: Facial Recognition

Facial recognition software can be tuned. I’m guessing for civilian use, they are going to set the dials to accept some false negatives to avoid false positives.

Speech-to-text software works remarkably well now for some applications. I use it daily for sending text messages because I can compose a 100 character message faster with my voice than my fingers. There might be an error or two in the text but most of the time the errors aren’t substantial.

orbitalinsertion (profile) says:

Re: Facial Recognition

So they are/were leaps and bounds ahead of everyone else whose implementations still give significant false positive rates?

I suppose it doesn’t matter since cops can always claim they are better than FR at a thousand meters in the dark in rain and fog when it turns out either or both misidentified someone. Don’t care what FR, DNA tests, ID, or anything else says, he’s the guy! I had to empty three clips into him. He was gonna turn me into a newt.

Anonymous Coward says:

Which has a higher rate of false positives … facial recognition or roadside drug tests?

It does not matter, they can just say their extensive training allows them to determine you need a good beatin and whatnot. Jeffy boy Sessions (who looks like Granny from the Beverly Hillbillies) is here to save us all from the depraved criminals out there who belong in prison … gotta protect those dividends.

My_Name_Here says:

I previously warned that insisting on increased usage of body cameras on police would cause problems. Footage of victims would be stored because Tim Cushing wants his thrills of catching police in “gotcha!” moments.

And now the pendulum has swung the other way. Enjoy your fleeting privacy, Cushing. You asked for it.

Faceguy says:

Facial Recognition

In my experience, the higher the camera resolution, the lower the fail rate. I really doubt an officer’s chest cam could have that much storage for a day’s worth of video. Now, the new 4K cameras would have cut the 2008 cameras’ fail rate down to the 1-2 percent range. The system I worked on used two cameras, so image storage would have been doubled.

I do agree wireless “real time” transmissions to a processing site would be the main bottleneck, even with compression of the data.
