Detroit PD Detective Sued For His (Second) Bogus Arrest Predicated On Questionable Facial Recognition Searches

from the fool-yourself-once... dept

On January 9, 2020, facial recognition tech finally got around to doing exactly the thing critics had been warning was inevitable: it got the wrong person arrested.

Robert Williams was arrested by Detroit police officers in the driveway of his home. He was accused of shoplifting watches from a store on October 2, 2018. The store (Shinola) had given Detroit investigators a copy of its surveillance tape, which apparently was of little interest to the Detroit PD until it had some facial recognition software to run it through.

The image the Detroit PD felt was capable of returning a quality match was dark and grainy.

That picture is included in Williams’ lawsuit [PDF] against the Detroit Police Department. Even in the best-case scenario, this picture should never have been used as the basis for a facial recognition search. It’s low-quality, poorly lit, and barely shows any distinguishing facial features.

What makes it worse is that all facial recognition AI — across the board — performs more poorly when attempting to identify minorities. That’s the conclusion reached by an NIST study of 189 different algorithms. It’s not just some software. It’s all of it.

The Detroit PD chose to run with that photo. Then it decided the search results it got back were close enough to probable cause to effect an arrest, even though the results themselves stated clearly that they should not be used that way. The search was performed by the Michigan State Police using the grainy image submitted by the Detroit PD. A report was returned, but investigators were explicitly cautioned against trying to turn it into probable cause:

The following statement appeared prominently on the Investigative Lead Report, in the form shown: “THIS DOCUMENT IS NOT A POSITIVE IDENTIFICATION. IT IS AN INVESTIGATIVE LEAD ONLY AND IS NOT PROBABLE CAUSE TO ARREST. FURTHER INVESTIGATION IS NEEDED TO DEVELOP PROBABLE CAUSE TO ARREST.” The phrase “INVESTIGATIVE LEAD ONLY” was highlighted in red ink.

The report was also light on any other details that might have indicated Robert Williams was actually the shoplifter in question.

The Investigative Lead Report contains neither the “score” generated by the facial recognition system representing the level of confidence that Mr. Williams’s photo matched the probe image, nor the other possible matches that, upon information and belief, should have been returned by the system.

The Detroit Police Department did not attempt to ascertain the “score” generated by the facial recognition search nor request the other possible matches to the probe photo.

Two months after the PD obtained these search results, the investigation was turned over to another detective, Donald Bussa. By the time he assumed control of the investigation, Bussa was supposed to be operating under the PD’s new facial recognition policy, which acknowledged the limitations of the tech and required search results to be peer reviewed to confirm their accuracy. This didn’t happen.

Defendant Bussa, however, ignored the new policy. Even though the facial recognition search “identifying” Mr. Williams as the shoplifter was generated by a woefully substandard probe image and had never been peer reviewed by DPD officers, as required by the new policy, Defendant Bussa decided to rely on the lead anyway.

Bussa assembled a “six-pack” of suspect photos that contained an image taken from Williams’ expired driver’s license. (Before this investigation took place, Williams had secured a new license with an updated photo.) He tried to speak to the staff at Shinola, but management refused to cooperate, stating that it was not interested in having its employees appear in court. Unable to speak to the sole eyewitness who had actually conversed with the shoplifting suspect, Detective Bussa decided to bypass Shinola completely.

Defendant Bussa then arranged to conduct a six-pack photo identification with Katherine Johnston. Ms. Johnston, then employed by Mackinac Partners, was contracted by Shinola for loss prevention services.

Defendant Bussa had no legitimate basis whatsoever for asking Ms. Johnston to participate in an identification procedure. Ms. Johnston was not an eyewitness. Ms. Johnston was not in the Shinola store at the time of the incident and has never seen Mr. Williams or the alleged shoplifter in person. Indeed, Ms. Johnston’s sole relation to the incident was that she had watched the same low-quality surveillance video that Detective Bussa possessed.

Bussa then sent Detective Steve Posey out with the loaded “six-pack,” pretty much guaranteeing Williams would be selected as the prime suspect.

The photo array was not a blind procedure—Posey knew that Mr. Williams was the suspect. Indeed, Posey’s sheet was nearly identical to that given to Ms. Johnston, except that Mr. Williams’s name was printed in red while all other names were printed in black.

Ms. Johnston identified Mr. Williams’s expired license photo as matching the person she had seen in the grainy surveillance footage, and answered the question “Where do you recognize him from?” with “10/2/18 shoplifting at Shinola’s Canfield store.”

With that, Bussa went out to get an arrest warrant, and the rest is facial recognition history. He secured the warrant by omitting some key details, like the fact that the suspect was picked out of a facial recognition database using a low-quality image. The warrant application also did not note that the person who picked Williams out of Bussa’s loaded lineup wasn’t even at the store on the day the shoplifting happened. And it didn’t mention Bussa’s bypassing of Detroit PD policies that were put in place to prevent exactly this sort of false identification.

The lawsuit also points out that the same software and the same two detectives were involved in another false arrest — one that occurred five months before the PD arrested Williams. Detective Bussa and Detective Posey used unvetted search results to arrest Michael Oliver for an assault he didn’t commit. Even if the facial recognition software had done its job accurately (which it didn’t), the tech would not have noticed something far more obvious: the suspect’s arms (as captured in the phone recording) were unmarked. Michael Oliver’s arms are covered with numerous tattoos.

Williams alleges a long list of violations he’s hoping to hold Detective Bussa (and his supervisor) accountable for. It’s going to be pretty difficult for the detective to argue he operated in good faith in the Williams arrest. After all, he’d already followed the same broken process to falsely arrest someone else months earlier. Then he ignored the PD’s policy on facial recognition tech. He also ignored the big, bold warning printed across the search results he obtained from the State Police. And none of this information — which would have undercut his probable cause assertions — made its way into his warrant request.

Any reasonable officer would know a lot of what Detective Bussa did was wrong. But Bussa would know this more than even the most reasonable of officers because it wasn’t the first time he’d screwed up.

Comments on “Detroit PD Detective Sued For His (Second) Bogus Arrest Predicated On Questionable Facial Recognition Searches”

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: The least he could do...

luv how the judge who issued the actual Arrest Warrant here is given a total pass on any wrongdoing.

the judiciary is supposed to protect the public against this type of police misconduct.

Judges are supposed to know much more about the law, the rules of evidence, and probable cause than cops do.
This judge rubber-stamped a highly questionable warrant application and is the primary guilty party.

And where was the Detroit City Attorney’s office in this process? They’re the ones who would have to prosecute the case in court, but they can’t be bothered with any of the details establishing the case in the first place?

The criminal justice system is corrupt, aside from any technical shortcomings with facial-recognition. That’s the main lesson here.

Anonymous Coward says:

Re: Re: The least he could do...

Judges can be pretty bad and rubber-stampey, but none of them interrogate every last detail of a warrant. "Is the person who identified the suspect someone who was actually present during the commission of the crime?" is not something they should even have to ask, any more than "Did this occur on Earth?"

If cops are lying on a warrant application, there should be consequences, and judges should be among those able to pursue, or directly impose, those consequences. Sometimes they probably can.

Anonymous Coward says:

Re: Re: Re: The least he could do...

…judges are under no obligation or legal pressure to approve any police warrant application.

If a judge has the slightest reasonable doubt about the validity of a warrant application, that judge can, and should, simply reject that application.
Judges are not required to question the cop and help him craft an acceptable application.

This was an alleged petty misdemeanor anyway — guess there were no serious crimes in Detroit for this cop to investigate.

This comment has been deemed insightful by the community.
Tanner Andrews (profile) says:

Re: Re: The least he could do...

the judiciary is supposed to protect the public against this type of police misconduct

Not exactly. In many jurisdictions, the cops know which judges will sign off on warrants without asking annoying questions. They tend to present warrant applications to those judges.

This practice avoids inconvenience for the police. We hope that it also prevents the public from having any foolish expectation of protection from government overreach.

This comment has been deemed insightful by the community.
Whoever says:

Perjury?

and answered the question “Where do you recognize him from?” with “10/2/18 shoplifting at Shinola’s Canfield store.”

Ms. Johnston lied under oath, since she could not have recognized anyone from the shoplifting incident as she was not present.

Are there no consequences for this?

This comment has been deemed insightful by the community.
DocGerbil100 (profile) says:

Beyond the technology angle, I didn’t think too much about these cases when TD reported on them previously. Looking again today, a few rather unpleasant thoughts definitely spring to mind.

Now that I consider the matter, I can only presume these cases have drawn a measure of heightened scrutiny because of the deficiencies of the technology used. However, if they’re this sloppy all the time, always grabbing faces that just happen to vaguely fit the crime, how many other supposed criminals are still in jail without such an obvious investigatory failing to point to when pushing back against similarly false allegations?

Following on from that — and considering that both the suspects in these known cases are black — I have to wonder just how many of the unknown number of falsely-accused suspects are also from ethnic backgrounds. It’s hardly a revelation to say that the job of policing attracts all sorts of people, from the very best to the very worst — including some of the most virulent racists in society.

I don’t know how to locate career histories for individual police officers, but if what they’ve done in these two cases is any indication, then they could well have made decades-long careers for themselves, just by mostly arresting and charging semi-random suspects, purely on the basis of their skin colour.

I’d hope that Detroit PD would look very carefully at the history of these two officers and take the opportunity to get its house in order. Sadly, with American policing being what it is — especially the world-infamous Detroit PD — I doubt this will actually happen.

Inferences and suppositions about two questionable officers can only be that — but all signs suggest to me that there’s far more here that’s rotten than just the technology.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

Re: In this case the machine did a much better job though

Yeah, that wasn’t blind confidence in the software, since the software explicitly said the match wasn’t good enough for an arrest. This was someone who ‘knew’ he had the right suspect and was going to stack the deck as much as needed to see him arrested and punished.

Scary Devil Monastery (profile) says:

Re: Re: The modern day version of 'reasonable cause on demand'

…with the added "benefit" that facial-recognition tech is ubiquitously and infamously bad at differentiating between faces with darker skin tones. I mean, when you have a 50% chance of not seeing the difference between Prince and Janet Jackson, there are going to be a whole lot of false positives conveniently floating around.

Yeah, contrast loses a lot of its leverage when skin is dark enough to blend into the shadows on a crappy camera in poor lighting. The solution should be either building any camera used for facial recognition to specs that rise above this problem, or having the facial recognition algorithm just say "ambiguous results, I got nothing" when it can’t clearly recognize the markers.
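
A minimal sketch of what that second suggestion might look like, assuming a hypothetical recognition engine that reports a numeric confidence score for each candidate (the names and the 0.90 threshold below are illustrative assumptions, not taken from any real facial recognition product):

    # Hypothetical sketch only: refuse to return a "match" when the engine's own
    # confidence score is too low to mean anything. All names and the threshold
    # are illustrative, not taken from any real facial recognition product.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Candidate:
        subject_id: str
        score: float  # engine-reported confidence, 0.0 to 1.0

    MIN_USABLE_SCORE = 0.90  # below this, treat the whole search as inconclusive

    def best_match(candidates: List[Candidate]) -> Optional[Candidate]:
        """Return the top candidate only if it clears the confidence floor."""
        if not candidates:
            return None
        top = max(candidates, key=lambda c: c.score)
        if top.score < MIN_USABLE_SCORE:
            # "Ambiguous results, I got nothing" beats handing investigators a
            # low-confidence name they might treat as a positive identification.
            return None
        return top

The point is simply that an inconclusive search should surface as "no match" rather than as a ranked list that reads like an identification.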

Anonymous Coward says:

Drug Sniffing dogs

Seems like this tech is a similar tool to drug sniffing dogs, just a way to generate probable cause and make false arrests. The officer was probably approached by a "company representative" and pushed into "proving the software works".

"Don’t worry, what could go wrong? We got your back".

Look into the bozo’s finances.

ECA (profile) says:

Anyone?

Has anyone seen some of the old camera pictures: fixed focus, out of exposure range, sun in the wrong position? Anyone ever seen the first digital camera pics and video? Anyone looked at even the current range of security cameras and wondered WTH?
There is one really nice trick available to many digital cameras: IR lighting. Every outdoor camera, and most indoor ones, can see in IR, and many have IR LEDs for exactly that reason. Those that don’t see IR have a filter on them so they can’t; the sensors are very sensitive to IR (it even damages the sensor over time). But under invisible IR light they can see very well. So why in hell don’t security cameras use this feature? It makes the picture black and white, and very detailed.
But people think color is best, and don’t realize you can capture both at the same time.

ECA (profile) says:

Re: Re: Anyone?

Probably means they’d be wearing a very thin synthetic, worth less than $1 and sold to them for $10-100.
It’s not a porn channel, and the recordings are only used at the time of a crime.
And I really don’t think IR will penetrate much of anything.
https://www.researchgate.net/publication/258196189_Transmittance_of_Infrared_Radiation_Through_Fabric_in_the_Range_8-14_m

Consider that this is one-way, and a 65% loss even through one layer would make any reflection off the skin nearly impossible to pick up, let alone the additional absorption by the skin itself, which would take it down to effectively zero. Now, there is a medical application here: with the correct lighting, you can see skin damage, especially the types that can lead to cancers.
