Facial Recognition Rings Up Another False Arrest, Leading To The Accused Being Brutalized In Jail

from the guess-we're-just-going-to-do-this-repeatedly dept

Facial recognition may be helping law enforcement catch bad guys, but inherent flaws in these systems ensure it’s only a matter of time before the AI coughs up yet another unforced error.

That sort of error rate might be acceptable when the AI is doing nothing more than pitching in to refine Google search results or, I don’t know, crafting articles for national sports publications. But it’s far more harmful when it’s helping the government deprive people of their freedom.

These may seem like anomalies when compared to the massive number of facial recognition searches law enforcement performs every day. But each mistake causes falsely accused people to pay a heavy price — something that goes far beyond a short detainment.

Someone merely accused of a crime can expect to have their options for pretty much anything (financial assistance, housing, employment) negatively impacted. Any period of incarceration is far more than an inconvenience. Even short stays can result in lost jobs, evictions, or reputation injury that lasts far longer than whatever time is spent locked up.

This case, covered by Matthew Gault for Vice, involves a private company’s use of facial recognition tech. But that tech is used to report people to law enforcement, which means the government quickly brings its power to bear in cases like these, to the detriment of life and liberty.

The details of this case are horrific. What started out as a mistaken armed robbery accusation soon turned into a nightmare for Houston, Texas resident Harvey Murphy.

According to a lawsuit 61-year-old Murphy has filed against Macy’s and Sunglass Hut, “he was arrested and put into an overcrowded maximum-security jail with violent criminals. While in jail trying to prove his innocence, he was beaten, gang-raped, and left with permanent and awful life-long injuries. Hours after being beaten and gang-raped, the charges against him were dropped and he was released.”

“All of this because a company told the police, based on artificial intelligence, that you were the one who committed terrible crimes,” the lawsuit said.

It was a company that pulled the trigger on this one. An armed robbery on January 22, 2022, saw two men threaten Sunglass Hut employees with guns before walking off with a handful of cash and designer glasses. A similar robbery at a Macy’s prompted the two companies to work together to identify the suspects.

A mug shot of Murphy, taken nearly 40 years ago, was the supposed key to this investigation. The Sunglass Hut loss prevention officer, Anthony Pfleger, reached out to law enforcement claiming he knew who had committed the crimes. This loss prevention officer also allegedly coached a Sunglass Hut employee to identify Murphy in the law enforcement mugshot lineup.

But there was a major flaw in this investigation — one initiated by a private company and closed out by Houston law enforcement.

At the time of the robbery, Murphy was in Sacramento, California. He didn’t find out about the robbery, or that he’d been blamed for it, until he went to the DMV to renew his driver’s license. He was arrested and held without bond. Despite sending his court-appointed lawyer the evidence that exonerated him, he still spent hours in jail.

Law enforcement loves to portray detentions that only last “hours” as so minimally detrimental to people’s rights as to not be worthy of additional attention, much less the focus of a civil rights lawsuit. But it only takes seconds to violate a right. And once it’s violated, it stays violated, even if charges are eventually dropped.

For Murphy, it only took hours to be physically and sexually assaulted by other inmates and detainees. And it only took one false match to put him through this hell, despite him being more than 2,000 miles away when the robberies occurred.

And that’s the problem with this tech. It will always be able to ruin someone’s life because it can’t be trusted to deliver accurate matches, especially when it comes to women and minorities. Private companies and law enforcement agencies can point to low false positive percentages all they want, but at the end of the day, people pay a significant, extremely real price for things dismissed as acceptable error rates.

Companies: sunglass hut


Comments on “Facial Recognition Rings Up Another False Arrest, Leading To The Accused Being Brutalized In Jail”

37 Comments
Anonymous Coward says:

From the Vice article:

He was arrested and put into an overcrowded maximum-security jail with violent criminals.

Actually, he was held without bond. He wasn’t convicted, not even tried, just a suspect in an armed robbery (one that, it seems, had no victims).

Sure, AI may be to blame, but how could a suspect end up in an “overcrowded maximum-security jail with violent criminals” in the first place?

Anonymous Coward says:

Re:

Then again, a judge could be convinced to grant bail with a little “payoff” (Bitcoin, of course).

You can, afterwards, wipe the bitcoin wallet from your hard disk with a good secure disk wiping tool, and they will have no evidence. No evidence = no CASE

Never keep bitcoin in any exchange, always keep it in a private wallet.

Anonymous Coward says:

This is why you want to keep your phone in insane cop-proof mode, so they cannot muscle you into a plea agreement over what they might find: “You plead to this or we will charge you with that.”

I will never give police or prosecutors that chance.

You cannot be charged with any crime in any of Canada’s provinces or territories, Mexico’s 31 states, or America’s 50 states for locking law enforcement out of your phone.

There is no criminal statute that covers locking law enforcement out of your phone

And “booby trap” mode has been replaced by something even better. Even the encrypted data cannot be accessed without your password.

In short, the data will not even flow when the phone is plugged into a computer and that will stop their asses cold.

My new phone has that, and is even more cop proof than any other phone I have had.

You need to make your phone impossible for law enforcement to access in case this ever happens to you.

Another option is to jam their wifi in case their facial recognition is using wifi. They will never figure out they are being jammed.

Anonymous Coward says:

Re: Re:

That is why you dial up the security on your phone to the point where it is impossible for law enforcement to access it.

There is no law anywhere in Mexico, Canada, or the USA that makes it illegal to lock law enforcement out of your phone.

Never give them a chance to muscle you into a plea agreement.

ECA (profile) says:

Re: No

If you want to do something criminal, YOU DON’T USE A PHONE…
If you NEED one, get a throwaway phone, WHICH they have been trying to KILL/GET RID OF. There are a few companies that STILL SELL THEM.

Like a bank robbery, where the judge approves a GEOFENCE search of all phones in the area at the time…
YOUR PHONE ALWAYS PINGS THE TOWERS.

This comment has been flagged by the community.

Anonymous Coward says:

I think that is what might have happened some years ago in San Diego when one hotel accused me of having been there before and making trouble.

That was the first time I had been in San Diego since I was 9 years old.

When I go to San Diego now, I deploy a jammer when checking into any hotel so that can never happen again.

In addition to wifi, I jam 900 and 1200 MHz in case they are using older cameras operating on those frequencies.

That happened to me once, and I was out the cost of a Sea World ticket I could not use because I ended up having to sleep overnight in a parking garage and leave the next morning.

I do not break any FCC rules when I use that jammer to knock out facial recognition, so what happened to me at one hotel never happens to me again.

There might be state laws on that, but I do not break any federal laws using that jammer to prevent their cameras from getting my face.


Anonymous Coward says:

Yes, facial recognition failed here but there’s so much wrong with this case besides facial recognition. There’s the loss prevention employee claiming that the accused committed the crime instead of just providing a lead. There’s the cops arresting the guy instead of making any attempt to confirm the accusation. There’s putting the guy in jail even though he was just accused of retail theft. There’s the violence in jail. AI may be the spark that ignited the powder keg but it’s far from the only problem here.

That Anonymous Coward (profile) says:

This is the part I like most of all…

“This loss prevention officer also allegedly coached a Sunglass Hut employee to identify Murphy in the law enforcement mugshot lineup”

He had so little faith in the facial rec that he coached someone to pick out the person he thought was the criminal.

I see that adding at least five zeros to the settlement amount. The fact that he was thousands of miles away during the event should destroy the facial rec firm.

Everyone keeps screaming about how AI is going to kill us all. AI isn’t going to hurt anyone… it’s idiots using its results without any independent thought who are going to kill people.

Facial Rec does not work like in the movies or on tv.
Facial Rec has screwed up repeatedly, putting innocent people in jail facing charges for events they had no connection to.

Until someone wins a huge sum of money (and being in his 60s & gang-raped while detained makes him a winner), people aren’t going to stop pretending tech can’t make a mistake, and it’s going to keep happening.

Also, perhaps a conversation should be had about how some companies have decided loss prevention is like a real police force.

Wyrm (profile) says:

Poor excuses

Private companies and law enforcement agencies can point to low false positive percentages all they want

Basically, their excuses to people being falsely arrested because of their technology: “You’re just statistically insignificant.”

And law enforcement doesn’t help by not doing basic investigative work before throwing people in jail or prison for “an insignificant amount of time” either.

There are a lot of people who need to learn that real people hide behind these numbers, no matter how low.

Wyrm (profile) says:

Poor comparison

Section 230 doesn’t prevent lawsuits against someone who defamed someone else. It only prevents lawsuits against the tool used to do so.

In this case, by contrast, it is the police’s fault. A basic check on their suspect would have proven that he was nowhere near the crime scene at the time. There was no reason to jail him, much less in a high-(in)security prison with hardened criminals.

The police looked at the specific situation, made a choice, and took action, so they bear responsibility. They have a job, and they took the lazy route: jail someone first, ask questions later, after deep physical and mental trauma.

If you can’t distinguish this from a platform that can’t examine every piece of content transiting through its mostly automated systems, I can only hope you never get in a position to legislate about it.
