GAO's Second Report On Facial Recognition Tech Provides More Details On Federal Use Of Clearview's Unvetted AI

from the still-greater-than-zero-agencies,-unfortunately dept

A couple of months ago, the Government Accountability Office completed the first pass of its review of federal use of facial recognition technology. It found a lot to be concerned about, including the fact that agencies were using unproven tech (like Clearview’s ethical nightmare of a product) and doing very little to ensure the tech was used responsibly.

Some agencies appeared to have no internal oversight of facial recognition tech use, leading to agencies first telling the GAO no one was using the tech, only to update that answer to “more than 1,000 searches” when they had finished doing their first pass at due diligence.

A more complete report [PDF] has been released by the GAO, which includes answers to several questions asked of federal agencies using the tech. Unfortunately, it confirms that many agencies are bypassing what few internal controls are in place by asking state and local agencies to run searches for them. DHS entities (CBP, ICE) did the most freelancing using downstream (governmentally-speaking) databases and tech.

For whatever reason, CBP and ICE (which have access to their own tech) are using agencies in Ohio, Nebraska, Michigan, Kansas, and Missouri (among others) to run searches for criminal suspects and to “support operations.” A whole lot of non-border states are allowing agencies to bypass internal restrictions on use of the tech.

And there’s a whole lot of Clearview use. Too much, in fact, considering the number of agencies using this highly questionable product exceeds zero.

The US Air Force says it engaged in an “operational pilot” beginning in June 2020, utilizing Clearview to run searches on biometric information gathered with “mobile biometric devices, including phones.”

The Inspector General for the Department of Health and Human Services also apparently used Clearview. The report says the HHS OIG “conducted an evaluation of the system in an attempt to identify unknown subjects of a criminal investigation.” Experimentation, but with the added bonus of possibly infringing on an innocent person’s life and liberty!

Also on the list are CBP, ICE, and US Secret Service. ICE appears to be the only agency actually purchasing Clearview licenses, spending a total of $214,000 in 2020. The CBP, however, is getting its Clearview for free, utilizing the New York State Intelligence Center’s access to run searches. The Secret Service gave Clearview a test drive in 2019 but decided it wasn’t worth buying.

The Department of the Interior says it has both stopped and started using Clearview. Under "Accessed commercial FRT [facial recognition technology] system," the DOI claims:

Interior uses Clearview AI to verify the identity of an individual involved in a crime and research information on a person of interest. Interior may submit photos (e.g., surveillance photos) for matching against the Clearview AI’s repository of facial images from open sources. U.S. Park Police reported it stopped using Clearview AI as of June 2020.

But under “New access to commercial FRT system,” the DOI states:

Interior reported its U.S. Fish and Wildlife Service began using a trial version of Clearview AI in May 2020, and purchased an annual subscription in June 2020.

The DOI is both a current and former customer, depending on which component you speak to, apparently.

The DOJ is an apparent believer in the power of Clearview, providing access to the ATF, DEA, FBI, and US Marshals Service. But there must be a lot of sharing going on, because the DOJ only purchased $9,000-worth of licenses.

Interestingly, the DOJ also notes it received an upgrade from Axon, which provides body-worn cameras. Axon has apparently added a new feature to its product: “Facial Detection.” Unlike facial recognition, the product does not search for faces to run against a biometric database. Instead, the system “reviews footage” to detect faces, which can then be marked for redaction.

This FRT-related expenditure is also interesting, suggesting the DOJ may actually be trying to quantify the effectiveness of body cameras when it comes to deterring officer misconduct.

DOJ reported that it awarded an $836,000 grant to the Police Foundation for the development of techniques to automate analysis of body worn camera audio and video data of police and community interactions. In particular, these techniques could (1) allow an evaluation of officers’ adherence to principles of procedural justice and (2) validate the ratings generated by the automated process using a randomized control trial comparing software ratings of videos to evaluations performed by human raters under conditions of high and low procedural justice.

Finally, there’s this unnecessarily coy statement by the IRS about its use of commercial facial recognition systems.

A third-party vendor performed facial recognition searches on behalf of the IRS for domestic law enforcement purposes. Additional details on the search are sensitive.

Whatever. It’s probably Clearview. And if it isn’t, it probably will be at some point in the near future, given federal agencies’ apparent comfort with deploying unproven, unvetted tech during criminal investigations.

The report is probably the most comprehensive account of federal government use of facial recognition tech we have to work with at the moment. It shows there's a lot of it being used, but it hasn't become completely pervasive. Yet. Most agencies use the tech to do nothing more than identify employees and prevent unauthorized access to sensitive areas. Some agencies are digging into the tech itself in hopes of improving it. But far too many are still using a product which has been marketed with false statements and has yet to have its accuracy tested by independent researchers. That's a huge problem, and, while it's not up to the GAO to fix it, the report should at least make legislators aware of an issue that needs to be addressed.

Companies: clearview


Comments on “GAO's Second Report On Facial Recognition Tech Provides More Details On Federal Use Of Clearview's Unvetted AI”

Upstream (profile) says:

Bad in several ways

Tim points out a couple of the particularly bad aspects of facial recognition technology use in this article: that it is unvetted (he's being polite, since it is actually known to be very unreliable) and that its use by federal agencies is often via the proxies of state or local law enforcement agencies.

These aspects are particularly bad, since unreliable tech can only make a bad law enforcement system worse, and the Federal / local "partnerships" are very effective in thwarting what little accountability may exist in either realm.

But there are other bad aspects of the tech, as well. It is known to have significant racial / ethnic "issues." Facial recognition AI essentially says "Those [insert non-white racial / ethnic group here] all look alike to me." This has horrible implications for a criminal legal system that is already biased against non-white racial and ethnic minorities. Now the cops can just blame it on the "computer," again thwarting what little accountability may exist.

Another big problem with facial recognition AI is the base rate fallacy, a somewhat complex and completely non-intuitive concept in probability. It means that even a system with an ostensibly high accuracy rate can still be wrong very often when the thing being searched for is rare. This is completely unacceptable in a law enforcement situation, where people's rights, freedom, and even lives are on the line. It is also easily and completely avoidable: just don't use the crappy facial recognition AI.
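A quick back-of-the-envelope sketch shows how this plays out. The numbers here are purely hypothetical, chosen only to illustrate the fallacy, not taken from any real system:

```python
# Base rate fallacy: even a "99% accurate" face match is usually wrong
# when the person being searched for is rare in the searched population.
# All numbers below are hypothetical, for illustration only.

def posterior_match_probability(sensitivity, false_positive_rate, base_rate):
    """Bayes' rule: P(actual match | system reports a match)."""
    true_hits = sensitivity * base_rate
    false_hits = false_positive_rate * (1 - base_rate)
    return true_hits / (true_hits + false_hits)

# Suppose the system catches 99% of true matches, wrongly flags 1% of
# non-matches, and the real suspect is 1 face in 10,000 searched.
p = posterior_match_probability(0.99, 0.01, 1 / 10_000)
print(f"{p:.1%}")  # prints "1.0%" -- about 99 of every 100 flags are wrong
```

Even with a system that sounds excellent on paper, the rarity of the actual target means the overwhelming majority of "matches" are false alarms.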

It is good that this GAO report came out. It would be better if it were used to stamp out the malignancy of facial recognition AI everywhere it is being used.

Anonymous Coward says:

Re: Bad in several ways

…aside from the many technical and legal problems with facial recognition, one needs to step back and see it as a symptom of a much more dangerous U.S. government stampede toward a full surveillance state directed against the American people.

All levels of American government are constantly and diligently pursuing new technical and political methods of closely monitoring all citizen behavior at all times.

The government objective is full control of the population (of course, to make a "better society" to their liking).

Are the people running our government really our public servants, or our masters?

James Burkhardt (profile) says:

Re: Re:

Writing out "conditions of" immediately preceding your quote makes the quote much harder to parse. Let's get context, since context is always the best place to start.

According to

"Procedural justice refers to the idea of fairness in the processes that resolve disputes and allocate resources. Procedural justice speaks to four principles, often referred to as the four pillars: fairness in the processes, transparency in actions, opportunities for voice, and impartiality in decision making"
(Formatting adjusted for clarity)

And the full context for the quote you make (which is super important to understand its meaning):

DOJ reported that it awarded an $836,000 grant to the Police Foundation for the development of techniques to automate analysis of body worn camera audio and video data of police and community interactions. In particular, these techniques could…..(2) validate the ratings generated by the automated process using a randomized control trial comparing software ratings of videos to evaluations performed by human raters under conditions of high and low procedural justice.
(Emphasis mine)

In context, the "circumstances of [high] procedural justice" would therefore mean when departments and cities implement fair, transparent and impartial processes that give adequate means for the public to have a voice.

Since procedural justice is about local rules, how much procedural justice exists will vary from region to region with laws, policies, and practices. Some areas have conditions of a high level of procedural justice, or "high procedural justice", and some areas have low levels of procedural justice, or "low procedural justice". The quote implies that they are looking for a system that can help them assess how local procedural justice affects the assessment of body worn camera footage, or to use body cam footage to assess how local procedural justice affects the behavior of officers.

You both changed the context of the quote making it harder to parse, and seem unclear on the definition of procedural justice used by the DOJ.

sumgai (profile) says:

Re: Re: Re:

First, thanks James.

When I went to law school something like 48 years ago, we didn't even think of differentiating levels of procedural justice; it was just a given that fairness and transparency were part and parcel of the process. Now, at an advanced age, I have to agree that what I got was a pie-in-the-sky view of how things should be, not how they are (and likely, not how they were at that time, either).

The article's line in question could've been more clearly written. "Conditions" doesn't really connote why there might be different levels, as the words "high" and "low" easily convey for whatever words might follow. You were much more concise with "local", etc., but now we're using high and low to denote a level of quality-of-service, not something like "formal" versus "informal" (which is how I took it in the first place).

Shame that has to happen, but it’s gonna be a hard row to hoe, getting everyone on the same page.

sumgai (profile) says:

Did anyone else pick up on the fact that a report a few months ago said that many states, particularly New York, are claiming in their accountability reports that they offload their searches to the Feds, naming many of these same three-letter agencies that just got through claiming the exact opposite?

Just another excuse to pull out of the bag of tricks, when "those accountability assholes come snooping around".

ECA (profile) says:

Since computers came around

They have been trying to do a few things for a long time.

  1. Facial recog.
  2. An interstate recognition/records capability (what the FBI is supposed to be doing)

The biggest problem tends to be that facial pictures are not detailed enough. Not even good 2D, let alone 3D.
Then we come to storage. In the old days, with large tape storage, everything had to be loaded and compared. They didn't know what to compare, let alone how to store text with the pictures carrying the details needed to ID the person.
It was very slow. A person could ID pictures about 10+ times faster. Even now, with no 3D pictures yet, and no format for capturing skin textures and then adding a bit of text to the pictures to show identifying marks, we still have a ton of problems, many of which come down to details. Like looking at the picture and adding his height, or that he has tattoos, and reading the text about who this person is. Even though our SS# can tell us a lot about a person's work history, where they lived (I looked myself up and found 6 addresses from past history, back to my teens), our credit history, and where we last purchased things.
But little of that data is, supposedly, accessible to our police forces???
A lot of this comes down to money. Everything they do to investigate things seems to rely on 3rd-party sources, which charge money.
How many states have their own forensics labs, rather than relying on 3rd parties to do the work? Why pay $100,000+ per year if you don't need them? The easy answer is to have them work in the medical community as well and do the blood and chemical testing the hospitals use. (Go find out how much the hospital pays for this; it's a lot.)
