Canadian Privacy Commission Says Clearview's App Is Illegal, Tells It To Pack Its Things And Leave
from the pics-and-GTFO dept
Clearview has screwed with the wrong people. The reprehensible facial recognition AI company that sells access to its database of scraped photos and personal info managed to raise the ire of some of the most restrained and polite people in the world, as Kashmir Hill reports for the New York Times.
The facial recognition app Clearview AI is not welcome in Canada and the company that developed it should delete Canadians’ faces from its database, the country’s privacy commissioner said on Wednesday.
“What Clearview does is mass surveillance, and it is illegal,” Commissioner Daniel Therrien said at a news conference. He forcefully denounced the company as putting all of society “continually in a police lineup.”
Clearview does appear to violate Canadian privacy laws, which require consent before using personal data. This was the impetus for a yearlong investigation of Clearview by Canadian privacy commissioners. The company claimed its offering was legal because it only utilized publicly available data scraped from dozens of social media sites. The commission disagreed.
“Information collected from public websites, such as social media or professional profiles, and then used for an unrelated purpose, does not fall under the ‘publicly available’ exception,” according to the report.
Clearview is going to court over this determination, saying it does nothing Google doesn’t do and yet Google is still allowed to operate in Canada. Fair point, I guess, but Google doesn’t appear to be selling government agencies access to billions of pieces of personal info for them to paw through at their leisure.
And Canadian law enforcement agencies are using Clearview to do just that. The commission noted that "thousands of searches" have been performed by dozens of agencies, including the Royal Canadian Mounted Police. Of course, this doesn't appear to be making Clearview much money. Only one agency actually paid for access. The rest of the "thousands" of searches were performed using trial accounts — the kind Clearview encourages users to "go wild" with, testing the AI on photos of friends, family members, and anyone else they happen to have pictures of.
Clearview has already stopped selling access to the Canadian market, but that's not going to fix things. Clearview can control who it provides access to, but it's going to have a much more difficult time determining who's in its database. If the illegality finding holds up, Clearview will need to delete information pertaining to Canadian residents. Finding Canadians in a database with billions of data points is something Clearview's AI can't handle, especially since it's much harder to determine who any given piece of information belongs to once it's all been thrown into a big pile that's only expected to deliver matches to uploaded photos.
In the meantime, Clearview is offering Canadians the chance to opt out. All they have to do to be removed from Clearview’s database is provide the company with personal information it may not have already collected. And then Canadians are asked to trust a company that’s acted extremely carelessly and obnoxiously to follow through with its end of the bargain, rather than just add this new information to its existing stash.
For now, Clearview is merely facing the anger of Canadian regulators. There's really no legal force behind the commission's damnation. Not at the moment. But that could change, and it would make sense for Clearview to walk away from the mostly untapped Canadian market before this fight generates laws and legal precedent that would act as a blueprint for bans and removals in other countries.