Clearview Celebrates 10 Billion Scraped Images Collected, Claims It Can Now Recognize Blurred, Masked Faces

from the getting-bigger-and-getting-worse dept

Clearview's not going to let several months of bad press derail its plans to generate even more negative press. The facial recognition tech company that relies on billions of scraped images from the web to create its product is currently being sued in multiple states, has had its claims about investigative effectiveness repeatedly debunked and, most recently, served (then rescinded) a subpoena to transparency advocacy group Open the Government demanding information on all its Clearview-related FOIA requests as well as its communications with journalists.

I don't know what Clearview is doing now. Maybe it thinks it can still win hearts and minds by not only continuing to exist but also by getting progressively worse in terms of integrity and corporate responsibility. Whatever it is that Clearview's doing to salvage its reputation looks to be, at best, counterproductive. I mean, the only way Clearview could get worse is by getting bigger, which is exactly what it's done, according to this report by Will Knight for Wired.

The company’s cofounder and CEO, Hoan Ton-That, tells WIRED that Clearview has now collected more than 10 billion images from across the web—more than three times as many as has been previously reported.

Ton-That says the larger pool of photos means users, most often law enforcement, are more likely to find a match when searching for someone. He also claims the larger data set makes the company’s tool more accurate.

That's one way of looking at it. Another way of looking at it -- and by "it," I mean Clearview's unaudited, untested facial recognition AI -- is that adding more hay increases the odds of someone grabbing some hay and thinking it's actually a needle.

Yet another way of looking at this is that Clearview's mass scraping of every bit of publicly accessible web data it can find may be legal, but it certainly isn't morally acceptable. While people do largely understand that their public posts to sites can be accessed by nearly anyone, they certainly don't expect someone to collect their photos and data in bulk, package it up, and sell it to government agencies. And, in some states, this sort of activity may actually be illegal, hence the lawsuits being brought by government officials.

On top of the 10 billion images Clearview swears it will only sell to responsible adult government employees, the company is now claiming it can do some real CSI-type stuff with its tech.

Ton-That says it is developing new ways for police to find a person, including “deblur” and “mask removal” tools. The first takes a blurred image and sharpens it using machine learning to envision what a clearer picture would look like; the second tries to envision the covered part of a person’s face using machine learning models that fill in missing details of an image using a best guess based on statistical patterns found in other images.

If you feel selling government agencies a more efficient way to generate false positives and false negatives is the path to future profitability, this would be the route to take. Without a doubt, tech advances will eventually make this more accurate, but rolling out unproven machine learning on top of unproven AI is only going to compound errors. Then there's the bias problem, which has plagued all facial recognition software, including programs that have been independently tested by the National Institute of Standards and Technology (NIST). Notably, Clearview has yet to subject its AI to outside testing.
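Clearview's actual models are proprietary and untested, but a toy numpy sketch illustrates why any "deblur" tool must guess rather than recover: blurring is a many-to-one operation, so distinct originals can produce the identical blurred image, and a deblurrer can only pick one of the many candidates consistent with what it sees. Everything below (the kernel, the signals) is illustrative, not Clearview's method:

```python
import numpy as np

def circular_blur(x, k=3):
    """Circular moving-average blur of width k -- a toy stand-in for
    whatever blur a real photograph has undergone."""
    n = len(x)
    # For each position, average the k circularly-adjacent samples.
    idx = (np.arange(n)[:, None] + np.arange(-(k // 2), k // 2 + 1)) % n
    return x[idx].mean(axis=1)

# Two clearly different "images" (1-D rows of pixels for simplicity)...
face_a = np.array([4.0, 1.0, 2.0, 4.0, 1.0, 2.0])
# ...that differ by a pattern the blur kernel annihilates
# (every window of 3 consecutive samples sums to zero):
null_pattern = np.array([1.0, 0.0, -1.0, 1.0, 0.0, -1.0])
face_b = face_a + null_pattern

blur_a = circular_blur(face_a)
blur_b = circular_blur(face_b)

print(np.allclose(blur_a, blur_b))   # True: both originals blur to the same image
print(np.allclose(face_a, face_b))   # False: yet the originals differ
```

Since two different faces can yield the same blurred input, a "deblur" model's output is a statistical guess about which original it was, not a recovery of it; "mask removal" inpainting faces the same problem in a more extreme form, since the covered pixels carry no information at all.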

Finally, there's this statement from Clearview CEO Ton-That:

The company says it is not currently pitching the technology outside of the US or to private industry. “We're focusing on the United States, because we want to get it right here,” Ton-That says. “We never want this to be abused in any way.”

Whether this statement about its current potential customer list is true remains to be seen. Clearview has already pitched its product to private companies and foreign governments. And it appears to have exited one foreign market solely because its product was declared illegal following a government investigation.

And claiming that the company does not want its product "abused in any way" directly contradicts the stuff it says to entities it wants to sell its product to. Emails from the company's marketing staff encouraged potential law enforcement customers (as well as the occasional billionaire) to "experiment" with the software by running searches on friends, family members, and others who never consented to be part of multiple Clearview test drives.

Is Clearview the worst AI product out there? In terms of accuracy, who knows? It hasn't been independently reviewed. In terms of everything else, there's really nothing out there that competes with it. The company's nonchalant conversion of the open web into a surveillance tool sets it apart from the competition. Its latest "advances" aren't going to do anything to rehabilitate its reputation.



Filed Under: facial recognition, surveillance
Companies: clearview


Reader Comments



  • Pixelation, 15 Oct 2021 @ 8:19am

    And now, Clearview knows Tim looks just like Darth Vader...


    • Anonymous Coward, 15 Oct 2021 @ 1:32pm

      Re:

      So... they do know that completely different entities, say, Tim and... oh, I don't know, Clearview, can look exactly the same?


  • Anonymous Coward, 15 Oct 2021 @ 8:44am

    So they think they've actually invented Zoom and Enhance?


    • TaboToka (profile), 15 Oct 2021 @ 9:49am

      Enhance

      Ton-That says it is developing new ways for police to find a person, including “deblur” and “mask removal” tools.

      Not invented, PERFECTED!!!!!111

      What a time to be alive. And by "time" I mean in an age of grifters and charlatans who purport to recreate information from nothing.


  • Ian Williams, 15 Oct 2021 @ 9:05am

    Odd thought: could their scraping websites for photos be copyright infringement? Photos do have copyright attached, and while Facebook et al. have permissive licenses, they have not actually sublicensed the scraped images to Clearview, which arguably then creates commercial derivative works from them, in the form of its face maps.


    • Pixelation, 15 Oct 2021 @ 9:23am

      Re:

      Looks like Clearview will owe more money than there is in the known universe.


    • Scary Devil Monastery (profile), 18 Oct 2021 @ 2:58am

      Re:

      "Odd thought, could their scraping websites for photos be a copyright infringement?"

      It would be odd if it didn't. We'll know so when Liebowitz's peers in the copyright troll cave start taking action on behalf of clients real and imagined.


  • Anonymous Coward, 15 Oct 2021 @ 11:00am

    Mask removal. That's just great. A guy robs a bank in a ski mask and they show the jury a picture with YOUR FACE on it. "machine learning models", "best guess", "statistical patterns"... welcome to the Big House.

    I jest, of course. Courts might balk at such things, which is why they're obsolete. The bank will just drone your house.


  • That One Guy (profile), 15 Oct 2021 @ 2:38pm

    'Our house is already on fire, let's add some gasoline!'

    Someone really needs to learn to read the room...


  • That Anonymous Coward (profile), 15 Oct 2021 @ 5:44pm

    unkind joke about Chuck and floors at Arby's goes here


    • That Anonymous Coward (profile), 15 Oct 2021 @ 9:20pm

      Re:

      I forget people don't always have my frame of reference...
      the CEO of Clearview is besties with Charles C. Johnson, google him, google the claims about the befouling of a floor of an Arby's.


  • scote, 15 Oct 2021 @ 6:54pm

    Machine learning makes s*** up.

    You can't use machine learning for legitimate forensic purposes.

    Machine learning doesn't recover detail; it doesn't unblur faces, and it doesn't magically show the face beneath the mask. Instead, it invents plausible photorealistic detail. It's literally a computer program faking evidence in an utterly and dangerously convincing way, and almost certainly doing so in a very biased way.


    • Scary Devil Monastery (profile), 18 Oct 2021 @ 3:14am

      Re: Machine learning makes s*** up.

      "It's literally a computer program faking evidence in an utterly and dangerously convincing way, and almost certainly doing so in a very biased way."

      ...and this is why, given sharp contrasts and shadows to work with in a controlled environment, a computer's best guess can be close enough to identify a white person in good light, but it can't tell the difference between Prince and Oprah Winfrey in any kind of light - let alone tell Mr. John Doe of African-American descent from a mugshot of Bin Laden.

      Honestly, everywhere but the US, Clearview would by now have been asked to produce credible evidence for its claims or get hit with serious sanctions for false advertising. But American law does love to incentivize the snake oil salesman brand of conmanship...


  • Anonymous Coward, 16 Oct 2021 @ 7:34am

    Best I can really hope for is in 3-5 years I'll get a check for around $10 as a result of a class action.


  • Anonymous Coward, 16 Oct 2021 @ 8:09am

    The ONLY thing clearview recognizes is planting fake evidence in their database because they want "black people" to pay for crimes even if there is no evidence other than "a gut feeling".

    That's how they claim they can essentially "see through" masks magically...

    You can guarantee 100% of the time it'll be "the black guy did it..they're all criminals anyway"


  • Anonymous Coward, 16 Oct 2021 @ 8:11am

    Apparently Clearview funds KKK chapters, proud boys and a load of neo-nazi organizations, or that its directors are rabidly racist, homophobic antisemites?

    They want a "recognition" database 1) for the money and 2) because they've discussed how they can imprison non-whites based entirely on their "evidence".


    • That One Guy (profile), 17 Oct 2021 @ 12:35pm

      Re:

      Gonna go with a [Citation Needed] for that one as that seems like something that would be getting a wee bit of attention given the scrutiny the company's under.


    • Scary Devil Monastery (profile), 18 Oct 2021 @ 4:30am

      Re:

      "They want a "recognition" database 1) for the money..."

      Full stop, right there. There is money in peddling the snake oil of facial recognition as the next big brand of forensic quackery. And the amount of money available grows in direct proportion to how little fact-checking of the method's accuracy there is.

      If Clearview thought they could get away with it, they'd make any pitch they thought could get any vested interest, from law enforcement to private kindergartens, to buy some Clearview miracle medicine.


