Las Vegas Police Are Running Lots Of Low Quality Images Through Their Facial Recognition System

from the that's-going-to-end-badly dept

Even at its best, facial recognition software performs pretty poorly. When algorithms aren’t generating false positives, they’re acting on the biases baked into them, making it far more likely for minorities to be misidentified by the software.

The better the image quality, the better the search results. The use of a low-quality image pulled from a store security camera resulted in the arrest of the wrong person in Detroit, Michigan. The use of another image with the same software — one that didn’t show the distinctive arm tattoos of the non-perp hauled in by Detroit police — resulted in another bogus arrest by the same department.

In both cases, the department swore the facial recognition software was only part of the equation. The software used by Michigan law enforcement warns investigators search results should not be used as sole probable cause for someone’s arrest, but the additional steps taken by investigators (which were minimal) still didn’t prevent the arrests from happening.

That’s the same claim made by Las Vegas law enforcement: facial recognition search results are merely leads, rather than probable cause. As is the case everywhere law enforcement uses this tech, low-quality input images are common. Investigating crimes often means relying on security camera footage, captured by cameras far less capable than the multi-megapixel cameras found on everyone’s phones. The Las Vegas Metro Police Department relied on low-quality images for many of its facial recognition searches, documents obtained by Motherboard show.

In 2019, the LVMPD conducted 924 facial recognition searches using the system it purchased from Vigilant Solutions, according to data obtained by Motherboard through a public records request. Vigilant Solutions—which also leases its massive license plate reader database to federal agencies—was bought last year by Motorola Solutions for $445 million.

Of those searches, 471 were done using images the department deemed “suitable,” and they resulted in matches with at least one “likely positive candidate” 67% of the time. But 451 searches, nearly half, were run on “non-suitable” probe images. Those searches returned likely positive matches—which could mean anywhere from one to 20 or more mugshots, all with varying confidence scores assigned by the system—only 18% of the time.
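Vigilant’s scoring internals are proprietary, but the behavior described above — a probe image returning anywhere from zero to twenty-plus mugshots, each with a confidence score — can be sketched generically. Everything in this snippet (the function name, the threshold value, the scores) is an invented illustration, not Vigilant’s actual API:

```python
# Hypothetical sketch of a facial recognition search returning ranked
# candidates. Vigilant Solutions' real scoring is proprietary; the names,
# threshold, and score values here are invented for illustration only.

def rank_candidates(scores, threshold=0.75, max_results=20):
    """Return (mugshot_id, confidence) pairs that clear the threshold,
    highest confidence first, capped at max_results."""
    hits = [(mid, s) for mid, s in scores.items() if s >= threshold]
    hits.sort(key=lambda pair: pair[1], reverse=True)
    return hits[:max_results]

# A sharp probe image tends to produce a few high-confidence candidates;
# a blurry, "non-suitable" probe produces uniformly low scores, so nothing
# clears the threshold and the search returns no likely positive candidates.
good_probe = {"mugshot_17": 0.91, "mugshot_42": 0.80, "mugshot_03": 0.55}
bad_probe = {"mugshot_17": 0.41, "mugshot_42": 0.38}

print(rank_candidates(good_probe))  # two candidates clear the bar
print(rank_candidates(bad_probe))   # empty list: no usable lead
```

This toy model also shows why the suitable/non-suitable split above matters: the system doesn’t refuse to search a bad image — it just quietly returns fewer (or zero) candidates, which is exactly the pattern in the LVMPD’s numbers.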

Fortunately, low-quality images rarely seem to return anything investigators can use. (Although that 18% is still 82 “likely positive matches…”) If they did, we’d be seeing far more bogus arrests than we’ve seen to this point. Of course, prosecutors and police aren’t letting suspects know facial recognition software contributed to their arrests, so courtroom challenges have been pretty much nonexistent.

Although most of the information in the documents is redacted — making it difficult to verify LVMPD claims about the software’s contribution to arrests and prosecutions — enough details remained to provide a suspect facing murder charges with information the LVMPD had never turned over to him or admitted to in court.

Clark Patrick, the Las Vegas attorney representing [Alexander] Buzz, told Motherboard that neither the LVMPD nor the Clark County District Attorney’s office ever informed him that investigators identified Buzz as a suspect using, at least in part, facial recognition technology. The Clark County District Attorney’s office did not respond to an interview request or written questions.

Had this information been given to Buzz and his attorney at the beginning of the trial, he likely would not have waived his right to a preliminary evidentiary hearing. If this had taken place — along with knowledge of a private company’s contribution to the investigation — prosecutors may have had to produce information about the tech and the surveillance footage it pulled images from.

The documents don’t appear to show a reliance on low-quality images to make arrests, but they do show investigators will run nearly any image through the software to see if it generates some hits. The precautions taken after a search matter most. If investigators treat matches as nothing more than leads, that will head off most false arrests. But if investigators take shortcuts — as appears to have happened in Detroit — the outcome is disastrous for those falsely arrested. A person’s rights and freedoms shouldn’t be at the mercy of software that performs poorly even when given good images to work with. The use of this software is never going to go away completely, but agencies can mitigate the damage by refusing to treat matches as probable cause.


Comments on “Las Vegas Police Are Running Lots Of Low Quality Images Through Their Facial Recognition System”

OGquaker says:

Nothing here, moving on

Back in the 1950’s when every newspaper in South America was ‘Classified’ and restricted in the US, one paper on the west coast (That west coast, not this one) published the same photo whenever an Asian man was the subject of a news story. Since all US Peace Officers are convinced (a verb) that all of us are in some way criminal, and could be dangerous, any photo will do.

ECA (profile) says:

You install a chip with a DNA sample to be registered ON ENTRY to any business.. There is no perfect solution.. not even near a good solution.

Until you can give a computer the ability to discern racial differences.. and be at least 99% correct, you won’t get more than 10%. (IMO)
Tons of companies that sell BASIC security devices don’t tell you they have CRAP cameras.
I don’t think MOST of them even know the restrictions OF the cameras. LIKE: never point at a SUN-FILLED window. You can’t see much of anything.
