Clearview Celebrates 10 Billion Scraped Images Collected, Claims It Can Now Recognize Blurred, Masked Faces

from the getting-bigger-and-getting-worse dept

Clearview's not going to let several months of bad press derail its plans to generate even more negative press. The facial recognition tech company that relies on billions of scraped images from the web to create its product is currently being sued in multiple states, has had its claims about investigative effectiveness repeatedly debunked and, most recently, served (then rescinded) a subpoena to transparency advocacy group Open the Government demanding information on all its Clearview-related FOIA requests as well as its communications with journalists.

I don't know what Clearview is doing now. Maybe it thinks it can still win hearts and minds by not only continuing to exist but also by getting progressively worse in terms of integrity and corporate responsibility. Whatever it is that Clearview's doing to salvage its reputation looks to be, at best, counterproductive. I mean, the only way Clearview could get worse is by getting bigger, which is exactly what it's done, according to this report by Will Knight for Wired.

The company’s cofounder and CEO, Hoan Ton-That, tells WIRED that Clearview has now collected more than 10 billion images from across the web—more than three times as many as has been previously reported.

Ton-That says the larger pool of photos means users, most often law enforcement, are more likely to find a match when searching for someone. He also claims the larger data set makes the company’s tool more accurate.

That's one way of looking at it. Another way of looking at it -- and by "it," I mean Clearview's unaudited, untested facial recognition AI -- is that adding more hay increases the odds of someone grabbing some hay and thinking it's actually a needle.
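To put rough numbers on the haystack: at any fixed per-comparison false match rate, the odds that a probe photo falsely matches *someone* climb toward certainty as the gallery grows. Here's a minimal back-of-the-envelope sketch, assuming independent comparisons and a purely hypothetical one-in-a-billion false match rate (Clearview has published no such figure):

```python
# Sketch of the "more hay" problem: a tiny per-comparison false match
# rate still produces near-certain false matches against a huge gallery.
# The rate below is a hypothetical illustration, not a measured figure
# for Clearview's (untested) system.

def p_any_false_match(gallery_size: int, fmr: float) -> float:
    """Probability of at least one false match against a gallery,
    assuming independent comparisons."""
    return 1.0 - (1.0 - fmr) ** gallery_size

for n in (1_000_000, 100_000_000, 10_000_000_000):
    print(f"{n:>14,} images: {p_any_false_match(n, fmr=1e-9):.4%}")
    # ~0.0999%, ~9.5%, ~99.995% respectively
```

Even at one-in-a-billion per comparison, a ten-billion-image gallery makes at least one false match all but guaranteed, and nobody, Clearview included, has demonstrated its system performs anywhere near that well.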

Yet another way of looking at this is that Clearview's mass scraping of every bit of publicly accessible web data it can find may be legal, but it certainly isn't morally acceptable. While people largely understand that their public posts can be accessed by nearly anyone, they certainly don't expect someone to collect their photos and data in bulk, package it up, and sell it to government agencies. And, in some states, this sort of activity may actually be illegal, hence the lawsuits being brought by government officials.

On top of the 10 billion images Clearview swears it will only sell to responsible adult government employees, the company is now claiming it can do some real CSI-type stuff with its tech.

Ton-That says it is developing new ways for police to find a person, including “deblur” and “mask removal” tools. The first takes a blurred image and sharpens it using machine learning to envision what a clearer picture would look like; the second tries to envision the covered part of a person’s face using machine learning models that fill in missing details of an image using a best guess based on statistical patterns found in other images.
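For a sense of what "fill in missing details" means in practice, here's a minimal sketch using OpenCV's classical inpainting, which estimates occluded pixels from their surroundings. To be clear, this is a conceptual stand-in only: Clearview describes its tools as machine-learning-based and hasn't published how they work, and the file names and mask coordinates below are hypothetical.

```python
# Conceptual stand-in for "mask removal": classical inpainting fills an
# occluded region with a best guess based on surrounding pixels.
# Clearview's actual (ML-based) method is unpublished; file names and
# mask coordinates here are hypothetical.
import cv2
import numpy as np

face = cv2.imread("masked_face.jpg")  # hypothetical input photo

# Mark the occluded region (e.g., where a face covering sits) in a
# binary mask: 255 where pixels must be invented, 0 elsewhere.
mask = np.zeros(face.shape[:2], dtype=np.uint8)
mask[200:320, 100:260] = 255  # hypothetical rectangle over the covering

# Telea's algorithm propagates neighboring pixel values into the masked
# region, a crude analogue of guessing from "statistical patterns".
reconstruction = cv2.inpaint(face, mask, 3, cv2.INPAINT_TELEA)
cv2.imwrite("reconstructed_face.jpg", reconstruction)
```

Classical or ML, the output is a plausible invention rather than a recovery of ground truth, which is exactly why feeding reconstructions into a matcher invites confident false positives.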

If you feel selling government agencies a more efficient way to generate false positives and false negatives is the way to future profitability, this would be the route to take. Without a doubt, tech advances will eventually make this more accurate, but rolling out unproven machine learning on top of unproven AI is only going to compound errors. Then there's bias, which has plagued all facial recognition software, including programs that have been independently tested and examined by the National Institute of Standards and Technology (NIST). Notably, Clearview has yet to subject its AI to outside testing.
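The compounding is straightforward multiplication: the pipeline is only right when the reconstruction step and the matcher are both right. A quick sketch with purely illustrative numbers (Clearview has published accuracy figures for neither stage):

```python
# Errors compound multiplicatively across pipeline stages. The figures
# below are illustrative assumptions, not measurements of Clearview's
# system.
deblur_accuracy = 0.90  # hypothetical: reconstruction preserves identity
match_accuracy = 0.95   # hypothetical: matcher accuracy on clean images

combined = deblur_accuracy * match_accuracy
print(f"combined accuracy: {combined:.1%}")  # 85.5%, worse than either stage
```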

Finally, there's this statement from Clearview CEO Ton-That:

The company says it is not currently pitching the technology outside of the US or to private industry. “We're focusing on the United States, because we want to get it right here,” Ton-That says. “We never want this to be abused in any way.”

Whether or not this statement about its current potential customer list is true remains to be seen. Clearview has already pitched its product to private companies and foreign governments. And it appears to have exited one foreign market solely because its product was declared illegal following a government investigation.

And claiming that the company does not want its product "abused in any way" directly contradicts the stuff it says to entities it wants to sell its product to. Emails from the company's marketing staff encouraged potential law enforcement customers (as well as the occasional billionaire) to "experiment" with the software by running searches on friends, family members, and others who never consented to be part of multiple Clearview test drives.

Is Clearview the worst AI product out there? In terms of accuracy, who knows? It hasn't been independently reviewed. In terms of everything else, there's really nothing out there that competes with it. The company's nonchalant conversion of the open web into a surveillance tool sets it apart from the competition. Its latest "advances" aren't going to do anything to rehabilitate its reputation.


Filed Under: facial recognition, surveillance
Companies: clearview


Reader Comments

    Anonymous Coward, 16 Oct 2021 @ 7:34am

    Best I can really hope for is in 3-5 years I'll get a check for around $10 as a result of a class action.

