Facebook Pays $550 Million Settlement In Illinois Facial Recognition Lawsuit, Which Could Pose Problems For Clearview

from the upstart-may-be-able-to-burn-the-industry-to-the-ground dept

Late last week, legally and ethically dubious facial recognition tech developer Clearview was sued for violating an Illinois law that makes certain collection and storage of biometric information illegal. I was very dismissive of the lawsuit, stating that scraping publicly-posted photos couldn't possibly create an actionable violation of privacy.

My assessment may be wrong. A recent settlement by Facebook in a lawsuit alleging violations of this law suggests the proposed class action against Clearview might actually go somewhere, even if Clearview is an outside party (possibly illegally) scraping photo collections from other sites.

In what lawyers are calling the largest settlement ever reached in a privacy related lawsuit, Facebook will pay $550 million to settle claims it harvested users’ facial data without consent and in violation of a 2008 Illinois privacy law.

The announcement comes eight days after the Supreme Court denied Facebook’s petition to review the case and halt an impending trial that was put on hold in 2018 when Facebook appealed a lower court decision advancing the case.

This lawsuit dealt with yet another Facebook feature no one asked for: the nearly-automatic tagging of friends and acquaintances in uploaded photos. Facebook's AI scanned uploaded photos for matches and suggested names of people who resembled those in the photograph. Given the sheer number of uploaded photos hosted by the site, it was speculated Facebook could have faced a $35 billion fine. If true, the $550 million settlement is a bargain.
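The $35 billion figure follows from BIPA's statutory damages (the law allows $1,000 per negligent violation and $5,000 per intentional or reckless one) multiplied across the class; the class size of roughly 7 million Illinois users used below is an assumption based on contemporaneous reporting, not a number from this article:

```python
# Back-of-the-envelope estimate of Facebook's potential BIPA exposure.
# Assumed figures: BIPA statutory damages of $1,000 (negligent) and
# $5,000 (intentional/reckless) per violation, and a class of roughly
# 7 million Illinois users.
NEGLIGENT = 1_000  # dollars per negligent violation
RECKLESS = 5_000   # dollars per intentional or reckless violation

def max_exposure(class_size: int, per_violation: int) -> int:
    """Worst case: one statutory award per class member."""
    return class_size * per_violation

illinois_class = 7_000_000
print(max_exposure(illinois_class, RECKLESS))   # 35000000000 -> the $35 billion figure
print(max_exposure(illinois_class, NEGLIGENT))  # 7000000000
```

Against that worst case, a $550 million settlement is less than 2% of the theoretical maximum, which is the "bargain" the article describes.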

But let's take another look at the Clearview lawsuit. Once you get past some of the more ridiculous assertions, you're looking at a few fairly plausible allegations that could survive a motion to dismiss and open Clearview up for what is certain to be some damaging discovery. The company is already cruising around on the outer edges of the law by scraping sites for photos in violation of their terms of service.

The law requires companies doing business in Illinois to obtain permission from users before harvesting biometric info. Since Clearview scrapes sites to build its biometric database, it is obviously not securing anyone's explicit permission.

And Clearview does business in Illinois. It claims it has "partnered" with 600-1,000 law enforcement agencies -- claims that should be taken with several doses of salt considering the number of law enforcement rebuttals that have greeted its marketing assertions.

But this is verifiable.

Interim Chicago Police Superintendent Charlie Beck on Thursday offered a vigorous defense of CPD’s use of a facial recognition tool that matches images of unknown suspects to three billion photos scraped from social media.

Clearview AI, the Manhattan-based firm that developed the software, has come under fire after a lawsuit was filed in federal court in Chicago earlier this month seeking to halt the company’s data collection and after a New York Times report detailed the privacy concerns its technology has brought to the fore.

But, Beck said Thursday the department needs the tool and doesn’t abuse it. Without it, Chicago Police would “solve fewer crimes than places” that do use it, he said.

Ok. Seems unlikely an unproven tool would be better at solving crimes than stuff the CPD already does, but who wouldn't prefer to use an app rather than shoe leather to track down perps?

That satisfies the "doing business in" prong. The unauthorized-collection prong is a little cloudier. If Facebook's application of facial recognition AI was the key factor in its violation of the state law, Clearview's AI poses the same legal risk. Even if the sites Clearview scrapes aren't applying facial recognition AI to the photos, Clearview's application of its own AI creates a violation that didn't exist when it was simply collecting photos for its database.

Photos scraped from Facebook don't immediately become contraband, despite Facebook's application of AI to uploaded photos. The photos are inert, so to speak, when they're scraped by Clearview. It's the application of additional software to convert them into biometric information that causes the violation and triggers the rapidly-escalating fines.

Even if its scraping ultimately proves to be legal, its application of AI makes deploying this tech in Illinois very risky. The bizarre twist is that Clearview's decision to sell only to law enforcement agencies might save it from being successfully sued. There's a carveout in the law for government agencies and their contractors, which may mean the state law does not apply to Clearview.

Nothing in this Act shall be construed to apply to a contractor, subcontractor, or agent of a State agency or local unit of government when working for that State agency or local unit of government.

Whew. So, ask the government for some grievance redress and get told another part of the government has already given its private contractors a pass to do whatever they want with otherwise-illegal biometric collections.

I'm still shorting this lawsuit, but not nearly as aggressively as I did earlier. There's too much unknown about Clearview's data-gathering at the moment. But it appears Clearview can violate the law with impunity as long as it only sells its services to Illinois government agencies.



Filed Under: biometrics, illinois, privacy, privacy laws
Companies: clearview, clearview ai, facebook


Reader Comments



  • Anonymous Coward, 3 Feb 2020 @ 12:42pm

    Nothing in this Act shall be construed to apply to a contractor, subcontractor, or agent of a State agency or local unit of government when working for that State agency or local unit of government.

    Lawyers should have some fun with this line. I can see one side arguing that this phrasing only covers entities (whether human or corporate) executing on a contract, and incidentally violating the Act in the course of their work on the contract. They would then argue that the accused is doing the violation on its own initiative, and happens to also be a vendor selling the results of its violations to the State agencies. Under that view, the accused is arguably not protected by this exemption.

    I can also see the side clearly suggested by Techdirt's commentary, that being a vendor selling these services to the State qualifies the vendor as a contractor "working for that State agency," and thus shields the vendor from the Act.


    • Anonymous Coward, 3 Feb 2020 @ 12:56pm

      Re:

      That was my thought too. Since they started scraping and the like WELL before they started selling to LEOs, it would seem this would apply.


  • Nathan F (profile), 3 Feb 2020 @ 12:43pm

    But it appears Clearview can violate the law with impunity as long as it only sells its services to Illinois government agencies.

    Which, as we all know, have databases that are the most secure things in the world.

    /s


    • Uriel-238 (profile), 3 Feb 2020 @ 6:03pm

      Secure government department databases

      I was thinking this: this business is only one moderately-skilled hacker away from becoming a humongous privacy mess.

      If a private company makes a silo full of military-grade bioagent which it sells to the US Army, but then the silo springs a leak and infects a few counties full of people, can that private company be sued?


  • Anonymous Coward, 3 Feb 2020 @ 1:30pm

    What is Illinois's deal?

    That which is public is, by definition, not private. Anyone expecting any degree of privacy in things done in public is an idiot. The genie is simply not going back into the bottle, and people hyperventilating about the "privacy implications" of analyzing public data sound like modern-day buggy whip manufacturers.

    The world is changing. Adapt.


    • Anonymous Coward, 3 Feb 2020 @ 2:14pm

      Re:

      Illinois' deal is that they can pass such a law.

      "Public" was never the same as "infinitely examinable by machines and can be analyzed and collated with other bulk data in a way even a large group of human beings never could".

      If privacy and sensibility advocates change things a bit, I guess it is someone else who will also need to adapt.

      Anyway, why be such a fucking prick about it?


      • Anonymous Coward, 4 Feb 2020 @ 9:03am

        Re: Re:

        Illinois' deal is that they can pass such a law.

        Tim seems to think that they can't—not legally—but mainly talks about how it's dumb rather than giving any precedent by which it could be thrown out of court. I agree that they probably can pass and enforce it, whether or not it's a good idea.


  • Anonymous Coward, 3 Feb 2020 @ 1:53pm

    I was very dismissive of the lawsuit, stating that scraping of publicly-posted photos couldn't possibly create an actionable violation of privacy.

    But what does that have to do with anything? The question is whether it violates the Illinois law, not whether it violates some abstract notion of privacy. Pictures are biometric data, and the company did or didn't collect and/or store them.

    While it might be nice and logical if one could get a case dismissed on the basis that the alleged violation has nothing to do with the title or purpose of the law, or because the law is dumb, it doesn't work like that.


    • Anonymous Coward, 3 Feb 2020 @ 2:10pm

      Re:

      It has everything to do with the previous post, and exactly why he posted a re-examination in light of said Illinois law today.


      • Anonymous Coward, 3 Feb 2020 @ 8:59pm

        Re: Re:

        The previous post didn't justify this logic either. It quotes "According to BIPA, companies must obtain explicit consent from Illinois residents before collecting or using any of their biometric information -- such as the facial scans Clearview collected from people's social media photos." Then: "Whether it's possible to violate a privacy law by scraping public photos remains to be litigated"—OK, presumption of innocence and all, but the text leaves little ambiguity and I see no constitutional basis for throwing it out.

        The post goes on to say how it's all very stupid, but that's not the same as describing why the law is illegal. There are certainly examples of stupider laws being upheld, like when people are forced to register as sex offenders for having photos of themselves. Or when cops get immunity because there's no precedent for the exact outrageous thing they did (on the plus side, a court went the other way in the latest Techdirt story—and there's the famous quote "common sense revolts at the idea"—but that's the exception rather than the rule).


  • Wyatt Derp, 3 Feb 2020 @ 2:39pm

    How can they call it Facial Recognition when it does not actually recognize faces?


  • GHB (profile), 3 Feb 2020 @ 5:26pm

    Typical Facebook

    And the history of privacy issues keeps growing. Looking at the "contents" list of this Wikipedia article: https://en.wikipedia.org/wiki/Criticism_of_Facebook, the "Privacy issues" section has more entries than any other subsection in the list.


