Clearview Says Section 230 Immunizes It From Vermont's Lawsuit Over Alleged Privacy Violations

from the assuming-the-judge-will-call-this-theory-'novel'-which-isn't-a-compliment dept

Clearview is currently being sued by the attorney general of Vermont for violating the privacy rights of the state’s residents. As the AG’s office pointed out in its lawsuit, users of social media services agree to many things when signing up, but the use of their photos and personal information as fodder for facial recognition software sold to government agencies and a variety of private companies isn’t one of them.

[T]he term “publicly available” does not have any meaning in the manner used by Clearview, as even though a photograph is being displayed on a certain social media website, it is being displayed subject to all of the rights and agreements associated with the website, the law, and reasonable expectations. One of those expectations was not that someone would amass an enormous facial-recognition-fueled surveillance database, as the idea that this would be permitted in the United States was, until recently, unthinkable.

Thus, when an individual uploads a photograph to Facebook for “public” viewing, they consent to a human being looking at the photograph on Facebook. They are not consenting to the mass collection of those photographs by an automated process that will then put those photographs into a facial recognition database. Such a use violates the terms under which the consumer uploaded the photograph, which the consumer reasonably expects will be enforced.

This is somewhat the same point multiple companies have made with their (ultimately ineffective) cease-and-desist letters: we have not agreed to allow Clearview to harvest data from our sites and sell that collected data to others.

Whether or not selling this scraped collection to law enforcement agencies is unlawful in Vermont remains to be seen. But Clearview is fighting back in court, raising a truly questionable Section 230 defense against the AG’s lawsuit.

Clearview is represented by Tor Ekeland, who has been truly useful in defending people against bogus prosecutions. But Ekeland appears to believe Section 230 is a net loss for the public, so it’s interesting to see him raise it as a defense here.

Clearview’s motion to dismiss [PDF] compares Clearview to Google, claiming its bots crawl the web and cache images (and other data) on its servers. However, Clearview claims it collects “far less data” than comparable search engines. According to its filing, Clearview does not collect any identifying info either — at least not intentionally. It only harvests photos and their metadata. The company says only 10% of the photos in its 4-billion-photo database have any metadata attached.
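To make the claim concrete, here is a toy sketch of the kind of harvesting the filing describes: a crawler parses a page, keeps each image URL, and records whatever descriptive metadata the page happens to attach. This is purely illustrative — the `ImageHarvester` class and the sample page are invented for this example, not anything from Clearview's actual system.

```python
from html.parser import HTMLParser

class ImageHarvester(HTMLParser):
    """Toy crawler stage: collect image URLs plus any inline metadata
    (alt text, title) found on an HTML page. Hypothetical example only."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            src = attrs.get("src")
            if src:
                # Keep the photo URL plus whatever descriptive attributes
                # exist -- most images carry none, mirroring the claim
                # that ~90% of harvested photos lack metadata.
                meta = {k: v for k, v in attrs.items() if k in ("alt", "title")}
                self.images.append({"src": src, "meta": meta})

page = '<html><body><img src="a.jpg" alt="profile photo"><img src="b.jpg"></body></html>'
h = ImageHarvester()
h.feed(page)
# h.images -> [{'src': 'a.jpg', 'meta': {'alt': 'profile photo'}},
#              {'src': 'b.jpg', 'meta': {}}]
```

The point of the sketch: the scraper takes everything it can reach, and whether a photo arrives with metadata depends entirely on what the source page happened to include.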

However, this doesn’t mean the software can’t compile a staggering amount of information on a person and return this long list in response to an uploaded facial photo. To comply with California data privacy laws, Clearview has given state residents the opportunity to see what Clearview has gathered on them. It’s a lot.

The depth and variety of data that Clearview has gathered on me is staggering. My profile contains, for example, a story published about me in my alma mater’s alumni magazine from 2012, and a follow-up article published a year later.

It also includes a profile page from a Python coders’ meetup group that I had forgotten I belonged to, as well as a wide variety of posts from a personal blog my wife and I started just after getting married.

The profile contains the URL of my Facebook page, as well as the names of several people with connections to me, including my faculty advisor and a family member…

Clearview’s assertions about the personal information it intentionally gathers are meant to head off the Vermont AG’s claims that the company is violating state privacy laws. They’re also meant to portray Clearview as no more damaging to privacy than a Google search. The problem is cops are less likely to trust a Google search and more likely to trust a company that says it has 4 billion images and 600 law enforcement “partners,” even if the search results are equally questionable.

But on to the Section 230 argument, which is kind of amazing in its audacity.

Clearview is entitled to immunity under the CDA because: (1) Defendant is an interactive computer service provider or user; (2) Plaintiff’s claims are based on “information provided by another information content provider;” and (3) Plaintiff’s claims would treat Defendant as the “publisher or speaker” of such information.

These are the base claims. There’s more to it. But this is a company raising a defense afforded to service providers who host third-party content. Here, there is no third-party content — at least not in the sense that we’re used to. The “third parties” Clearview deals with are government agencies, which contribute no content of their own and only search the database of scraped photos using uploaded images.

In addition, the Vermont AG is not seeking an injunction against Clearview because of any particular content in its database. For example, the lawsuit is not predicated on defamatory content a user created. Instead, it’s suing Clearview because its method of database compilation ignores the state’s privacy laws. It’s hard to imagine how Section 230 fits this particular action, but this filing attempts to do exactly that.

First, Clearview asserts it’s a search engine just like Google. And if Google can’t be sued for violating privacy rights of users of other sites whose personal photos/information show up in Google searches, neither can Clearview.

The Attorney General seeks to prohibit Clearview from accessing or using publicly distributed photos, none of which Clearview AI created. Clearview AI’s republication of third-party content is the result of its search engine algorithm, which in this instance happens to be a biometric facial algorithm. The underlying technology does not transform Clearview into an information content provider that would be ineligible for CDA immunity. The CDA protects the publication of search engine results.


Clearview’s publication of its biometric facial algorithms results does not make it an information content provider any more than Google becomes one when it publishes its search algorithm results. Simply put, “[i]f a website displays content that is created entirely by third parties, … [it] is immune from claims predicated on that content.”

If Clearview did not create the content, it cannot be held responsible for its use of it — even if end users never specifically agreed to be part of a database accessible by law enforcement.

The motion also says Clearview cannot be viewed as a publisher, since all it has done is create (with scraped content) a searchable database of third-party content.

Vermont’s complaint cannot change the fact that it is targeting Clearview for performing the exact same functions as corporations like Google and Microsoft. Vermont claims that, “at a minimum, Clearview ‘must obtain the other party’s consent before’ using consumers’ photos from any website.” Google, in contrast, is said to “respect the Terms of Service of the websites they visit.” But Google searches are filled with information that individuals wanted to remain private, such as nonconsensually distributed intimate images. Nevertheless, Google has repeatedly been protected by §230 because courts have correctly viewed it as a publisher.


Clearview’s use of a complex algorithm does not negate the fact that it is performing the traditional role of a publisher. The Second Circuit emphatically rejected a claim that Facebook’s “matching” algorithm deprecated its status as a publisher. The Second Circuit has stated that, “we find no basis in the ordinary meaning of ‘publisher,’ the other text of Section 230, or decisions interpreting Section 230, for concluding that an interactive computer service is not the ‘publisher’ of third-party information when it uses tools such as algorithms that are designed to match that information with a consumer’s interests.”

The state has responded [PDF] to Clearview’s Section 230 assertions and has made the obvious point: this legal action isn’t being brought over content generated by third parties that would normally be met with immunity arguments. It’s being brought over Clearview’s acquisition and use of third-party content that state residents never agreed to being harvested/used by the facial recognition tech company.

The injurious information that Clearview claims gives it Section 230 immunity are the photographs that it screen-scraped. But the photographs themselves, as they were posted on the internet by their owners, are not injurious, and do not give rise to any of the State’s claims. Put another way, the State’s cause of action is not properly against millions of individuals who posted anodyne photographs on the internet.

The State’s claims are not based on the specific information at issue. The photographs themselves do not give rise to any of these claims. The State’s claims are for unfairness, deception, and fraudulent acquisition of data. Compl. ¶¶ 76-86. Specifically, Clearview’s conduct in acquiring the photographs through fraudulent means (see Section VI infra, discussing use of term “fraudulent”), storing them without proper security, applying facial recognition technology in a manner meant to violate privacy and infringe civil rights, and providing access to the database to whomever wanted it without concern for the safety or rights of the public, give rise to the State’s claims.

The state’s counter-argument hinges on a close reading of Section 230 — one that turns on a certain voluntary action by third parties.

Section 230 requires that the information at issue be provided by the third-party content provider. Again, the common thread in Section 230 cases is that the Information Content Provider posted the offensive information on the defendant’s servers. Here, no Vermont consumer could have intentionally provided any photographs to Clearview’s servers, because prior to the discovery in January of this year that Clearview had 3 billion photographs in a New York Times exposé, the general public did not know that Clearview existed.

It also points out how Clearview differs from the search engines it tries to favorably compare itself to.

Clearview is not a search engine like Google or Bing. Clearview’s App does something that no other company operating in the United States, including search engine companies, has ever done. In fact, search engine companies that are capable of creating a product like Clearview’s refused to do so for ethical reasons.

The state isn’t impressed by Clearview’s arguments and sums everything up with this:

For the fact pattern to apply, the photographs themselves would have to somehow be unfair or deceptive and the State’s claims would more properly be brought against the individuals.

In essence, the lawsuit isn’t about objectionable content hosted by Clearview, but objectionable actions by Clearview itself. That’s why Section 230 doesn’t apply. I’m not sure how the local court will read this, but it would seem readily apparent that Section 230 does not immunize Clearview in this case.

Companies: clearview, clearview ai


Comments on “Clearview Says Section 230 Immunizes It From Vermont's Lawsuit Over Alleged Privacy Violations”

That One Guy (profile) says:

Re: Re:

That’s the funny thing about 230, it seems to have this strange effect on people wherein it causes a decent percentage of them to wildly hallucinate what is and is not in it.

Usually this impacts those arguing against it but apparently it can also impact those attempting to use it as a legal defense.

Anonymous Coward says:

How is this not a copyright violation for every picture?

If you or I tried to copy every bit of data and pictures we could scrounge from the internet, we would be sued by the holders of the copyright, and rightfully so. Unless Clearview has a clear transfer of copyright for each and every picture, they are going to be sued out of existence. I know several lawsuits have involved tiny images generated from larger ones, but these are literal copies of work that other people own the rights to. They never authorized this company to receive a copy or gave them the right to profit from them.

Chris-Mouse (profile) says:

Re: How is this not copyright a violation for every picture?

That depends on what they do with it.
If Clearview scrapes the pictures and then only feeds them into the facial recognition AI, then that would be a transformative use, similar to the Google Books usage that was found to be legal. But if Clearview included a copy of my picture in the report they send to a client, then that would be different. Would it be infringement? I suspect not, but only a court could decide.

If you feel like taking them to court, good luck proving that your picture was not only scraped, but also sent onward to a Clearview client.

Anonymous Coward says:

One of those expectations was not that someone would amass an enormous facial-recognition-fueled surveillance database…

Here are some more un-expectations:

  • that the photo would be pinned to a dart board and used as a target by a vengeful ex-
  • that the photo would be silk-screened on to a flag.
  • that the photo would be used as the basis for a commemorative gold coin.
  • that the photo would be used by law enforcement to determine if you were breaking the law.
  • that the photo would be used by national intelligence (rather than a private company) to build up a facial recognition database.

I’m pretty sure "I didn’t intend my photo to be used that way" isn’t a very good legal argument.

Anonymous Coward says:

I find the premise of Clearview deplorable but its practice unavoidable. You’re not going to stop unethical scrapers from building some kind of face-trained model that recognizes other faces on the web; they just might have a slightly smaller dataset to scrape. The biggest complaint I have with Clearview is that it’s not suitable for police as a basis for identification. It’s just too inaccurate. It’s not much better than running a face through Yandex (Google’s algorithm does not match similar-looking faces; I think this is on purpose).
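The accuracy complaint in the comment above comes down to thresholds. Face recognition systems typically reduce each face to an embedding vector and call two faces a “match” when their similarity clears a cutoff; loosen the cutoff and you get more candidate matches but also more false positives. Below is a minimal, hypothetical sketch of that matching step — the tiny 4-dimensional vectors and the 0.8 threshold are invented for illustration (real systems use embeddings of 128+ dimensions and tuned thresholds).

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_match(emb_a, emb_b, threshold=0.8):
    """Declare a 'match' when similarity clears the threshold.
    A looser threshold returns more candidates -- and more false positives."""
    return cosine_similarity(emb_a, emb_b) >= threshold

# Toy "embeddings": a probe face, a second photo of the same person,
# and an unrelated person. All values are made up for this sketch.
probe = [0.9, 0.1, 0.3, 0.2]
same  = [0.88, 0.12, 0.28, 0.22]
other = [0.1, 0.9, 0.2, 0.7]
```

With these numbers, `is_match(probe, same)` clears the threshold and `is_match(probe, other)` does not — but nothing in the math tells an investigator how confident to be, which is why treating a “match” as an identification is the dangerous part.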

Anonymous Coward says:

VALID use for Clearview

So with all the rampant police abuse being captured at the public protests (where the people show up to protest and the police show up to RIOT), we have a need to identify these ‘gems of society’ to publicly call them out on their BS.

What better use of an unproven facial recognition system than to look for all the ‘bad apples’ in the bunches, and since we know just how reliable Clearview claims to be, we can just assume that any officer identified as a bad apple must really be a bad apple. I mean, there’s no way unproven snake oil like Clearview could be wrong, am I right?

I mean if it’s okay for the police to use on us, it should be fine for us to use it on the Police in the current situation. Someone could create a huge public database of all the bad apples identified during the police riots (what is really happening right now) and publicly post and present that information to the world (not just the US, everyone should know about these people).

And if they don’t like being targeted in this manner, then perhaps the technology is really not ready for widespread use and shouldn’t be used to support even larger police fishing expeditions…
