In Response To Getting Sued, Clearview Is Dumping All Of Its Private Customers
from the thanks-for-the-half-assed-largesse,-asses dept
This is the first good news we’ve heard from Clearview since its exposure by Kashmir Hill for the New York Times back in January. In response to a lawsuit filed against it in Illinois accusing it of breaking that state’s privacy laws with its scraping of images and personal info from a multitude of social media platforms, Clearview has announced it’s cutting off some of its revenue stream.
Clearview AI — the controversial face-tracking company known for scraping more than 3 billion photos from social media sites including Facebook and Twitter — said it is ending its relationships with non–law enforcement entities and private companies amid regulatory scrutiny and several potential class action lawsuits.
Responding to one of those lawsuits, Clearview claimed in legal documents filed in an Illinois federal court on Wednesday that it was taking those voluntary actions and would “avoid transacting with non-governmental customers anywhere.”
Clearview’s motion to dismiss [PDF] argues this lawsuit is now mooted, since it will no longer be doing business with any Illinois entities and has apparently found some way to sort out personal info on Illinoisans from the rest of its scraped stash. Take a couple of grains of salt before reading the following:
Moreover, Clearview has taken steps to secure and implement limits regarding the retention of any Illinois photos. Clearview is terminating access rights to its app for all account holders based in Illinois and is terminating the accounts of any non-law enforcement or government entity.
This is better, but it’s still nothing great. Governments can still use an app that violates privacy laws, and that’s still a big problem. While private companies may be using Clearview’s app the most, it’s the company’s government customers that are more of a problem. Unproven tech may get you booted from the local mall for resembling a shoplifting suspect, but law enforcement agencies can ruin your life and take away several of your freedoms. And Clearview isn’t exactly selective about who it sells to, so plenty of overtly abusive governments will still get to use untested facial recognition software to destroy lives without worrying about niceties like due process.
Presumably, this will end the use of Clearview’s AI by Illinois government agencies. But reporters at BuzzFeed couldn’t get any confirmation from the Chicago PD that its plug had been pulled.
While the Chicago Police Department did not return a request for comment, a spokesperson for the Illinois secretary of state confirmed the office had been using Clearview “for about six months” to “assist other law enforcement agencies.”
Law enforcement agencies make up most of Clearview’s customers in the state. If Clearview’s sworn statements are true, they should all find their accounts deleted in the very near future. But Clearview hasn’t been too honest when dealing with judgments passed by the court of public opinion, so it’s probably wise to hold our golf applause for this lawsuit-prompted pullout until more facts are in.
We also shouldn’t read too much into its declaration that it will restrict gathering data on Illinois residents. Clearview’s one-state filtering system apparently relies on metadata — a lot of which is stripped by sites when users upload photos. It also will avoid scraping data and images from sites with words like “Chicago” or “Illinois” in the URL, which will leave a lot of Illinois-based websites open for Clearview’s scraping business, no matter what it may be asserting in court. Sure, this is Clearview “taking steps” to prevent violating state law, but these two specifics can hardly be considered comprehensive.
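To see why that kind of filter leaks, here’s a minimal sketch of the approach as described — metadata checks plus URL keywords. This is purely illustrative, not Clearview’s actual code; the function name, keyword list, and EXIF handling are all assumptions for the example.

```python
# Hypothetical sketch of the filtering approach described above -- NOT
# Clearview's actual code. It illustrates the leak: most platforms strip
# GPS/EXIF metadata on upload, and most Illinois-related pages don't put
# "illinois" or "chicago" in their URLs, so both rules routinely miss.

ILLINOIS_KEYWORDS = ("illinois", "chicago")  # assumed keyword list

def should_skip(photo_url, exif_state=None):
    """Return True if this hypothetical scraper would exclude the photo
    as belonging to an Illinois resident."""
    # Rule 1: the URL contains a tell-tale keyword.
    if any(kw in photo_url.lower() for kw in ILLINOIS_KEYWORDS):
        return True
    # Rule 2: the photo's GPS metadata (if any survived upload) says Illinois.
    if exif_state == "IL":
        return True
    return False

# A Springfield, IL resident's photo, uploaded to a platform that strips
# EXIF data, sails through both checks:
print(should_skip("https://example-social.com/photos/8675309", None))      # False
# Only the obvious cases get caught:
print(should_skip("https://visit-chicago.example.com/skyline.jpg", None))  # True
```

The gap is visible in the first call: no metadata plus a generic URL means the photo is scraped anyway, which is exactly why a one-state filter built on these two signals can’t deliver what Clearview’s court filing promises.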
The better news is this:
In addition, the startup said it was implementing an “opt-out mechanism” to allow people to exclude photos from its database.
Again, no congratulations until we see this implemented, and only if there’s some outside auditing done to ensure Clearview is actually doing the things it says it is. For that matter, the AI still hasn’t been independently examined, so we don’t even know what sort of false positive rate Clearview’s three-billion-plus-image database is capable of generating.
Clearview is still as sketchy as ever and only slightly less dangerous following this move. As long as it still sells its scraped database and unproven AI to governments, it’s still a threat to millions of people.