Congress Members Want Answers After Amazon's Facial Recognition Software Says 28 Of Them Are Criminals

from the but-they're-all-crooks-amirite dept

Hey, American citizens! Several of your Congressional representatives are criminals! Unfortunately, this will come as completely expected news to many constituents. The cynic in all of us knows the only difference between a criminal and a Congressperson is a secured conviction.

We may not have the evidence we need to prove this, but we have something even better: facial recognition technology. This new way of separating the good and bad through the application of AI and algorithms is known for two things: being pushed towards ubiquity by government agencies and being really, really bad at making positive identifications.

At this point it’s unclear how much Prime members will save on legal fees and bail expenditures, but Amazon is making its facial recognition tech (“Rekognition”) available to law enforcement. It’s also making it available to the public for testing. The ACLU took Amazon up on its offer, spending $12.33 to obtain a couple dozen false hits using shots of Congressional mugs.

In a test the ACLU recently conducted of the facial recognition tool, called “Rekognition,” the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime.

The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country.

The bad news gets worse.

The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.).

So how did Amazon’s Rekognition assemble this chilling lineup of usual suspects?

Using 25,000 publicly available mugshots and Rekognition’s default settings, the ACLU picked up a bunch of false hits in very little time. And this is only a small portion of what’s available to law enforcement using this system. Agencies have access to databases full of personal info and biometric data for hundreds of thousands of people, including people who’ve never been charged with a crime in their lives.
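
For anyone curious what that setup looks like in practice, here is a minimal sketch of the kind of test the ACLU describes, written against Amazon’s real boto3 Python SDK for Rekognition. The bucket, object key, and collection names are hypothetical, and this is not the ACLU’s actual code; the API calls (create_collection, index_faces, search_faces_by_image) and the default FaceMatchThreshold of 80 are Rekognition’s own.

    import boto3

    # Rekognition client; the region is an assumption for this sketch.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    COLLECTION = "mugshot-test"  # hypothetical collection name
    rekognition.create_collection(CollectionId=COLLECTION)

    def index_mugshot(bucket, key):
        """Add one publicly available mugshot to the searchable collection."""
        rekognition.index_faces(
            CollectionId=COLLECTION,
            Image={"S3Object": {"Bucket": bucket, "Name": key}},
            ExternalImageId=key.replace("/", "_"),
        )

    def search(bucket, key, threshold=80):
        """Search one legislator's photo against every indexed mugshot.

        If the caller never touches `threshold`, Rekognition's default of
        80 applies -- nothing in the API nudges a user toward Amazon's
        recommended 95 for law enforcement use.
        """
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION,
            Image={"S3Object": {"Bucket": bucket, "Name": key}},
            FaceMatchThreshold=threshold,
            MaxFaces=5,
        )
        return response["FaceMatches"]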

The obvious downside to a false hit is, at minimum, the unjustified distribution of identifying info to law enforcement officers to confirm/deny the search results. At worst, it’s the loss of freedom for someone wrongly identified as someone else. Recourse takes the form of lawsuits with a high bar for entry and slim likelihood of success, thanks to several built-in protections for law enforcement officers.

Amazon continues to market this system to law enforcement agencies despite its apparent shortcomings. Very little has been written about the successes of facial recognition technology. There’s a good reason for this: there aren’t that many. There certainly haven’t been enough to justify the speedy rollout of this tech by a number of government agencies.

This little experiment has already provoked a response from members of Congress, who are demanding answers from Amazon about the ACLU’s test results. Amazon, for its part, claims the ACLU’s test was “unfair” because it used the default 80% “confidence” setting rather than the 95% recommended for law enforcement. The ACLU has responded, noting that 80% is the default setting on Rekognition and that nothing prompts the user — who could be a law enforcement officer — to change it to eliminate more false positives. In any event, at least Congress is talking about it, rather than nodding along appreciatively as federal agencies deploy the tech without public consultation or the mandated privacy impact assessments.

Companies: aclu, amazon


Comments on “Congress Members Want Answers After Amazon's Facial Recognition Software Says 28 Of Them Are Criminals”

Anonymous Anonymous Coward (profile) says:

Goose, Gander, Good, can you feel now?

I had been dreaming about Congresscritters, judges, high-level law enforcement officials, or even high-level bureaucrats being caught up in some of the antics law enforcement exhibits these days. This will do for a start.

It is too bad that they weren’t actually taken into custody and held for some time (it would be illegal to arrest a congressperson on their way to a vote) in order for someone up there in the ethereal levels of government to take notice. If they felt as vulnerable as the rest of us, they might put aside their quest for power and do something for the rest of us.

I have little hope, but this might give them a nudge in the right direction.

Uriel-238 (profile) says:

Re: Famous people getting profiled

I’ve seen the effect more often from famous people getting caught up in police shenanigans when they’re just trying to duck the crowds to do some light shopping.

The old Henry V trick of kings wandering about engaging their subjects while disguised in order to pick up the pulse of the common public has appeared in history a few times. It would make sense for our officials to try it occasionally if they cared about public opinion.

alternatives() says:

Re: Re: Famous people getting profiled

While the comment section of the local newspaper is a poor place to get public opinion, I’m betting social media analytics can be bundled up in such a way as to get the pulse of the under-50 crowd and the better-off over-50 crowd. Better than wandering the street or some kind of two-hour “listening session” in a town.

Anonymous Coward says:

Re: Need more info

> The bad news gets worse.

> The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.).

Facial recognition software has no race bias. If the results are "disproportionate" then the database of facial structures must represent the biased class in greater numbers. Don’t blame the tool.

Anonymous Coward says:

Re: Re: Need more info

In theory, it should have no racial bias.

However, in practice, it assigns a probability that two images are of the same face based on how likely it is that two people would have the same [insert list of facial features].

If the algorithm fails to account for multicollinearity (that is, the fact that two data points often show up together and thus the existence of the second doesn’t prove much once you know the existence of the first), then it can absolutely be racially biased. A poorly programmed algorithm, trained mostly on white faces, could easily conclude that two black people who share features uncommon to white faces, but common among black faces, look enough alike to be flagged as the same person. To have, really, a bias that all black people look alike, which would be incredibly racist.

Does this algorithm have that kind of bias? I don’t have enough information to know. It’s certainly happened in the past.

But categorically ruling it out seems foolish. And contending that it’s because the sample is larger is especially so: these kinds of algorithms are more accurate when they have more data to train on, so underrepresented racial groups are more likely to trigger false positives than overrepresented ones.
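
To make the multicollinearity point concrete, here is a toy illustration in Python. This is not Rekognition’s actual algorithm, and the feature base rates are invented; it just shows how a matcher that treats correlated features as independent pieces of evidence overstates its confidence for the group in which those features travel together.

    # Invented base rates for two facial features, for illustration only.
    rate_overall = {"feature_a": 0.15, "feature_b": 0.15}   # population-wide
    rate_in_group = {"feature_a": 0.60, "feature_b": 0.60}  # within one group

    def naive_confidence(shared_features, base_rates):
        """'Confidence' that two people are the same person, naively assuming
        the shared features occur independently of one another."""
        p_coincidence = 1.0
        for feature in shared_features:
            p_coincidence *= base_rates[feature]
        return 1 - p_coincidence

    shared = ["feature_a", "feature_b"]

    # Scored against population-wide rates: 1 - 0.15 * 0.15 = 0.9775.
    # The matcher is very confident these are the same person.
    print(naive_confidence(shared, rate_overall))

    # Scored against the rates that actually hold within the group:
    # 1 - 0.60 * 0.60 = 0.64. And if the features always co-occur there,
    # the real chance of coincidence is 0.60, so even 0.64 is too confident.
    print(naive_confidence(shared, rate_in_group))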

David says:

Re: Re: Need more info

If the results are "disproportionate" then the database of facial structures must represent the biased class in greater numbers. Don’t blame the tool.

So tacking on "with a computer" disperses all blame? A lot of prejudice is statistically valid and individually unjust. As long as our judicial system punishes people individually rather than for having features correlated with criminals, being racially unbiased, treating everybody as an individual until proven differently, is hard work. It’s also necessary in order not to create self-fulfilling prophecies, to have society progress as a whole, to reward individual virtue, and to be visible in similar ways to all constituents so that they can vote and campaign in a qualified manner.

"with a computer" does not magically disperse the bias reflecting our current society and its history. Nor does "with statistics".

Anonymous Coward says:

Re: Need more info

Photographic skin color will change according to lighting, being affected by both brightness and wavelength, so it would not be a reliable indicator, even without makeup. Unless of course there is some sort of built-in correction to adjust for ambient lighting conditions, a rather complex solution.

False positives would be expected to be “disproportionately of people of color” whenever that segment of the population is disproportionately criminal.

The widespread adoption of facial recognition by police could spark a boom in “defensive” plastic surgery, especially of the extreme variety. While the “old” Michael Jackson might have resembled many common criminals, the “new” Michael Jackson really didn’t resemble any other human on the planet, making any potential false positive extremely unlikely.

https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/

Will B. says:

Re: Re: Need more info

“False positives would be expected to be “disproportionately of people of color” whenever that segment of the population is disproportionately criminal.”

…or if that segment of the population is disproportionately criminalIZED. Consider: they are matching to convicted criminals, right? So what if, say, black people are more likely to be convicted of the same crime than white people? If, just for a hypothetical, black people are more likely to be arrested in the first place on marijuana-related charges, despite drug use actually being relatively even among most segments of society? Or, say, minorities are less likely to get out of convictions due to having poorer legal representation and more bias against them in a courtroom setting?

ShadowNinja (profile) says:

Re: Need more info

The thing is, some people, especially those with white skin, can change their skin color through tanning and sunburn much more easily.

My brother and I grew up thinking for a while that our grandfather was black because of the dark tan he always had year round. From what my mother said, it built up over time because of the lack of effective sunscreen products back in his day.

Anonymous Coward says:

Re: Need more info

Most facial recognition algorithms use machine learning, and it is difficult to make definitive statements about what does and doesn’t impact the result.

One known issue is that some algorithms use the light reflecting off the nose as a factor and darker skin results in less reflectivity and thus less variance in the reflection. In general, facial recognition algorithms consistently have the best results on the majority ethnicity of the country where they were designed.

Anonymous Coward says:

Re: Re: Need more info

The data source being compared against was mugshots. But the facial recognition algorithms may have been trained on a different dataset, which is what controls how they attempt to match faces. Likely sources would be a constructed representative sample of the US (which would be majority-white) or photos scraped from tagged photos on social media (which would be even more white). The algorithms generated that way would have the highest accuracy on white faces.

Thad (user link) says:

Re: Re: Need more info

Already addressed this downthread, but here it is again:

Yes, yes, the "lol everyone in Congress is a crook" joke has been made several times already in this thread. But no, not everyone in Congress is a wealthy crook. Raul Grijalva’s net worth is estimated at under $300,000. That’s not chump change, but it hardly makes him one of the wealthiest people in the country. It doesn’t even make him one of the wealthiest people in Tucson.

It would probably be helpful to actually look up the people who have been snagged by this racially-biased mismatch and find out who they are and what they’re about rather than make ignorant generalizations about people in Congress. If you seriously believe that John Lewis’s real problem with being racially profiled is that he doesn’t want to be mistaken for a poor person, then congratulations on having no fucking idea who John Lewis is.

Anonymous Coward says:

5 of 435 is 1.2%. The ACLU is a few hundred trouble-makers

out of 200 million Americans#. So this non-story is at best driven by 0.000-something% and then 1.2%, while you ignore a hundred items of high importance. Typical Techdirt Tempest-in-a-Thimble.

Five members of Congress … Jimmy Gomez, John Lewis, Luis Gutierrez and Mark DeSaulnier and Sen. Edward Markey

And anyway, WHAT THE HELL IS THE POINT OF CRITICIZING BETA SOFTWARE?

Next story, please. Probably have to wait 2 hours for another ginned-up fanboy-feeding re-write from several days ago.

Using reasonable definition of "American": doesn’t include you antis who want it changed to European feudalism or globalism, incoherent malcontents-without-a-cause (not even up to rebels, just grrr and stuff), nor those here illegally.

Anonymous Coward says:

Re: 5 of 435 is 1.2%. The ACLU is a few hundred trouble-makers

1.2% nationally? 7.2M people…

Societies prosecute social outliers. The Pareto principle scales, even if you prune the data. Which is to say that there will ALWAYS be outliers, even when you get rid of all of the current people you thought were outliers.

People who think they haven’t perpetrated at least one felony in their life haven’t read much law. What we’re talking about here is a technical system that objectively evaluates candidates for prosecution within a subjective sociological system, and that has a false positive ratio of 100%. Because all of us are criminals.

There is no question that this is going to go completely off the rails. There is a ratio beyond which the accuracy of law enforcement cannot scale without serious consequences. The law is simply not fit to become cybernetically enhanced, regardless of how good the tech is. As the pressure mounts, the most likely outcome is that discrimination on non-legal bases will become the pressure valve. Racism, sexism, etc. will be more determinant than the crime, because the field of prosecution will be abundant, and the prosecutors will therefore be compelled to choose. And you can be assured, they won’t choose people like them.

Uriel-238 (profile) says:

Re: Re: "not fit to become cybernetically enhanced"

The FBI was formed to target the 20th-century mobs and foreign espionage elements on US soil. Since the cold war ended, it has not transitioned well, which is why it now singles out mentally disabled people and frames them for terrorist-like activities.

Similarly, cannabis is becoming decriminalized and cocaine and meth have dropped off. Heroin is on the rise thanks to the opiate crisis, but arresting people who were hooked by their own doctors doesn’t look good. So the DEA has also turned to entrapment and busts with false evidence.

Part of the problem is that we have these agencies which were meant to attack certain types of crime. But if they succeed in actually reducing that crime (or the crime reduces on its own due to other circumstances) then they lose sweet, sweet budget money, and they have to maintain a high conviction rate, even if it’s manufactured.

alternatives() says:

Re: Re: 5 of 435 is 1.2%. The ACLU is a few hundred trouble-makers

At one time a package called rekognition had a free API for low-volume testing, and that API got closed off, I want to say, 3-5 years ago. Remember too that the US Army had facial recognition on iPads as part of its insurgent-database efforts in the (rounding up) two-decade Afghanistan conflict. That military data strikes me as a FOIAable thing, and I’m betting someone has it as part of their FOIA collection efforts.

Anonymous Coward says:

How does this compare to humans?

Too bad the ACLU didn’t provide side-by-side comparisons. It’s hard to judge this work without context. Amazon says it’s 80% confident a congressperson matches a mugshot, but what result would we get from humans? Would 80% of them also say these two photos are the same person, if shown together? That would hint at a very different problem: that the software isn’t buggy, but the entire idea of comparing huge datasets is flawed.

Anonymous Coward says:

Re: How does this compare to humans?

The ACLU in the last 40 years has been made up of those lawyers who were rejected from established firms or incapable of creating their own firms. They would like to make out like they are fighting for constitutional rights, but how many 2nd Amendment cases have they taken on? Zero.

Anonymous Coward says:

Re: Re: Re:

Huh? So a 95% confidence level would result in anywhere from 9.5 in 10 to 10 in 10 false positives? I think your mental math is incorrect.

80% confidence means that for every “positive” there is a 20% chance that it’s a “false positive”. That, on average, is a 20% error rate or 2 in 10 false positives.

Rekrul says:

Re: Re:

So if they increase the confidence level from 80% to 95%, by my math that should reduce false positives by a factor of 4. That means that "only" about 7, instead of 28, representatives would have been identified as criminals. Or is my math wrong?

No, the confidence level is what percentage of a person’s face matches that of another photo. Think of it this way:

If you were to tell a program to match words with 60% of the same letters, it would match the words "Aloha" and "Alone". However, if you increased the threshold to 80% or higher, the words would no longer match, because only 60% of the letters are the same.
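
A hypothetical few-line version of that analogy, since the threshold logic is easy to demonstrate (the overlap metric here is a toy, not how face similarity is actually computed):

    def overlap(a, b):
        """Fraction of positions at which two equal-length words agree."""
        return sum(x == y for x, y in zip(a, b)) / len(a)

    score = overlap("aloha", "alone")  # 3 of 5 letters agree -> 0.6
    print(score >= 0.60)  # True: a match at a 60% threshold
    print(score >= 0.80)  # False: no match once the bar is raised to 80%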

Thad (user link) says:

Re: Google Face Recognition

  1. Amazon, not Google.
  2. Yes, yes, the "lol everyone in Congress is a crook" joke has been made several times already in this thread. But no, not everyone in Congress is a wealthy crook. Raul Grijalva’s net worth is estimated at under $300,000. That’s not chump change, but it hardly makes him one of the wealthiest people in the country. It doesn’t even make him one of the wealthiest people in Tucson.

Anonymous Coward says:

Say what you want about law enforcement having access to huge swaths of automatically curated video; I’m much more worried about Amazon having access to criminal records.

At best, Amazon uses it for advertising purposes (acquitted of manslaughter? Check out our Amazon Basics 22-piece kitchen knife set!). At worst, people with the same name as felons can’t shop online.

Anonymous Coward says:

While law enforcement would not be on the receiving end of lawsuits, Amazon surely would, and this would be a huge liability for Amazon if anyone was falsely arrested by an officer using this system. It could result in one of the largest class action lawsuits ever brought.

I’m sure that Amazon’s CEO will be appearing at a congressional hearing over this once Congress learns of what happened.

Personanongrata says:

Precognition, Phrenology and Politicians

Congress Members Want Answers After Amazon’s Facial Recognition Software Says 28 Of Them Are Criminals

Mayhap Amazon’s Facial Recognition Software also has a precognition feature built in and is able to discern which Congress Member will turn to a life of political crime in the future.

Anonymous Coward says:

I’m guessing no one else noticed that only 25k publicly available mugshots were used.

I feel pretty sure that there are way more than 25k mugshots that could be used, even if publicly available ones were all that LEOs had access to.

Makes you wonder how many would have been flagged if a larger set than 25k had been used.

Jim P. (profile) says:

False Identification

Rather than a boon for freedom, this will end with the software being modified so sample pictures can be added and tagged to prevent them from coming up as possible suspects.

Congress will gleefully carve an exemption for itself, and maybe for judges, “senior” government officials, and certain other “elites,” just as they have done for TSA and other measures, by passing a law requiring such processes be built into any scanning systems.

Uriel-238 (profile) says:

Re: Criminal types

If you’re training the system on mugshots, then what is it learning to recognize? Criminal types.

That assumes that those people who end up processed in the legal system are actually criminal. There’s a lot of evidence that a significant number of arrests and convictions may be false. (We have no system to test it, and the current prison system is very resistant to challenges to convictions.)

Though yes, it should belie any patterns of profiling that the police use in choosing their suspects.
