Clearview Sued By Vermont Attorney General For Violating The State's Privacy Laws
from the engineering-its-own-downfall dept
For the second time in a little over 30 days, odious facial recognition tech supplier Clearview is being sued. Unlike the first lawsuit, which is a proposed class action over violations of Illinois’ biometric privacy law, this one [PDF] is being filed by a government agency. The Attorney General of Vermont is seeking to permanently ban Clearview from collecting info about state residents or from selling access to the info it has already collected.
The complaint alleges that Clearview AI has been collecting photos of Vermonters available on the internet – called “screen scraping” – and using artificial intelligence to “map” individuals’ faces. Private businesses, individuals, and law enforcement use this data via an app that allows a user to identify a person within seconds using only a photograph. No Vermont state or local law enforcement agencies have used the app. Clearview AI collects the facial recognition data of Vermont children, as well as adults, without their notice or consent.
The complaint also alleges that Clearview AI violated Vermont’s new Data Broker Law by fraudulently acquiring data through its use of screen scraping.
The method used to compile Clearview’s database of three billion images is going to make it very difficult for the company to dodge lawsuits like these. Scraping sites may be mostly legal, but Clearview doesn’t provide any way to opt out of this collection. And when users grant a platform permission to publish their information, the TOS agreement they made with the platform doesn’t magically migrate with any images scraped from the site.
From the lawsuit:
[T]he term “publicly available” does not have any meaning in the manner used by Clearview, as even though a photograph is being displayed on a certain social media website, it is being displayed subject to all of the rights and agreements associated with the website, the law, and reasonable expectations. One of those expectations was not that someone would amass an enormous facial-recognition-fueled surveillance database, as the idea that this would be permitted in the United States was, until recently, unthinkable.
Thus, when an individual uploads a photograph to Facebook for “public” viewing, they consent to a human being looking at the photograph on Facebook. They are not consenting to the mass collection of those photographs by an automated process that will then put those photographs into a facial recognition database. Such a use violates the terms under which the consumer uploaded the photograph, which the consumer reasonably expects will be enforced.
Clearview’s database compilation relies on its violating website terms of service en masse. Sites have begun pushing back, but at this point their legal options are limited to being angry about it. Some state-by-state litigation may actually have an effect, though, since it seems almost impossible for Clearview to comply with privacy laws governing the collection of biometric data. Selling access to the data makes compliance even more difficult, since there’s no way for those packaged up in Clearview’s database to request removal or limit access to the scraped data.
It’s too late for Clearview to salvage its reputation by claiming it controls access carefully and sells only to law enforcement agencies to assist in solving crimes. That cat walked right out of the bag and over to BuzzFeed, which reported Clearview is selling its tech to private companies and governments engaged in ongoing human rights abuses. Claiming to serve a higher purpose while behaving like a bottom feeder doesn’t do anything for anyone, least of all a company that hoped it would fly under the radar long enough to capitalize fully on a bunch of free photos it found scattered around the internet.