Google Says Clearview's Site Scraping Is Wrong; Clearview Reminds Google It Scrapes Sites All The Time
from the twospidermans.jpg dept
Clearview’s business model has resulted in some mutual finger pointing. The most infamous of facial recognition tech companies outsources its database development. Rather than seeking permission from the sites it harvests, it scrapes them for pictures of faces and whatever personal info accompanies them. The scraped info forms the contents of its facial recognition database, putting law enforcement only a few app clicks away from accessing over 3 billion images.
The companies being scraped have claimed this is a violation of their terms of service, if not actually illegal. It’s not clear that it’s actually illegal, even if it does violate the restrictions placed on users of these services. Twitter has already sent a cease-and-desist to Clearview, but it will probably take a court to make this stick. Unfortunately, Clearview’s actions could lead to some damaging precedent if Twitter forces the issue. Given the number of sites affected by Clearview’s scraping efforts, it’s probably only a matter of time before this gets litigious.
But the finger pointed by Google at Clearview hasn’t provoked the reaction Google may have hoped for. As CBS News reports, Clearview has returned fire by comparing its business model to Google’s business model.
Google and YouTube have sent a cease-and-desist letter to Clearview AI, a facial recognition app that scrapes images from websites and social media platforms, CBS News has learned.
[Clearview CEO Hoan] Ton-That also argued that Clearview AI is essentially a search engine for faces. “Google can pull in information from all different websites,” he said. “So if it’s public and it’s out there and could be inside Google search engine, it can be inside ours as well.”
He’s not wrong. Google’s bots crawl the internet non-stop, building a database for its search engine. But there is one key difference: website owners can opt out of Google’s indexing.
“Most websites want to be included in Google Search, and we give webmasters control over what information from their site is included in our search results, including the option to opt-out entirely. Clearview secretly collected image data of individuals without their consent, and in violation of rules explicitly forbidding them from doing so,” [YouTube spokesperson Alex Thomas] said in the statement to CBS News.
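The opt-out Google is referring to is the longstanding Robots Exclusion Protocol: a site owner publishes a robots.txt file, and compliant crawlers like Googlebot honor it before fetching pages. As a rough sketch of how that consent mechanism works in practice, here's a check using Python's standard-library `urllib.robotparser` against a hypothetical robots.txt (the example domain and rules are invented for illustration):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: any crawler may index the site,
# except for anything under /private/.
robots_txt = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(robots_txt)

# A compliant crawler asks before fetching each URL.
print(rp.can_fetch("*", "https://example.com/public/page.html"))    # allowed
print(rp.can_fetch("*", "https://example.com/private/photo.html"))  # disallowed
```

The protocol is purely honor-system: Googlebot checks it, but nothing technically stops a scraper that chooses not to, which is exactly the gap Clearview's business model exploits.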
There’s no way to opt out of Clearview’s “service,” other than just not existing on the internet. Ton-That is correct in assuming there’s very little legal exposure in scraping publicly available images from the net, but these statements don’t make him or his company any more sympathetic. Ton-That is serving up untested AI to as many law enforcement agencies as possible, encouraging them to test drive the app using faces of friends and family even as the company states the software should only be used for approved law enforcement purposes.
It also claims an accuracy rate of 99.6% for searches, but that number hasn’t been independently verified. What appears to be happening is a mass rollout of untested AI to law enforcement agencies via demo/trial accounts. Clearview claims to be working with over 600 law enforcement agencies, but very few agencies have stated publicly that they’ve used Clearview to perform investigations.
Clearview’s packaging of public information into a law enforcement app is unpleasant, but likely legal. The same thing goes on behind the scenes of multiple data aggregators that sell info and analytics directly to government agencies. The main difference here is Clearview hasn’t been shy about its desire to pitch a cheap app/database to law enforcement even as its product remains unproven and untested. And it puts cops a lot closer to their dystopian dream of being able to demand identification from anyone they run into on the streets.