South Korean Gov't Gave Millions Of Facial Photos Collected At Airports To Private Companies


Facial recognition systems are becoming an expected feature in airports. Often installed under the assumption that collecting the biometric data of millions of non-terrorist travelers will prevent more terrorism, the systems are just becoming another bullet point on the list of travel inconveniences.

Rolled out by government agencies with minimal or no public input and deployed well ahead of privacy impact assessments, these systems let airports around the world inform travelers that they can fly anywhere, as long as they give up a bit of their freedom.

What’s not expected is that the millions of images gathered by hundreds of cameras will just be handed over to private tech companies by the government that collected them. That’s what happened in South Korea, where facial images (mostly of foreign nationals) were bundled up and given to private parties without ever informing travelers this had happened (or, indeed, would be happening).

The South Korean government handed over roughly 170 million photographs showing the faces of South Korean and foreign nationals to the private sector without their consent, ostensibly for the development of an artificial intelligence (AI) system to be used for screening people entering and leaving the country, it has been learned.

The agency carelessly handing out millions of facial images to private tech companies was the country’s Ministry of Justice. Ironically enough, South Korean privacy activists (as well as some of the millions contained in the database) say this action is exactly the opposite of “justice.”

While the use of facial recognition technology has become common for governments across the world, advocates in South Korea are calling the practice a “human rights disaster” that is essentially unprecedented.

“It’s unheard-of for state organizations—whose duty it is to manage and control facial recognition technology—to hand over biometric information collected for public purposes to a private-sector company for the development of technology,” six civic groups said during a press conference last week.

The project — one with millions of unaware participants — began in 2019. The MOJ is in the process of obtaining better facial recognition tech to arm its hundreds of airport cameras with. To accomplish this, it apparently decided the private sector should take everything those cameras had collected so far and use the images to train facial recognition AI.

The public was never informed of this by the Ministry of Justice. It took another government employee to deliver the shocking news. National Assembly member Park Joo-min requested information from the Ministry about its “Artificial Intelligence and Tracking System Construction” project and received this bombshell in return.

Maybe the government felt this was okay because most of the images were of non-citizens. This is from South Korean news agency Hankyoreh, which broke the story:

Of the facial data transferred from the MOJ for use by private companies last year as part of this project, around 120 million images were of foreign nationals.

Companies used 100 million of these for “AI learning” and another 20 million for “algorithm testing.” The MOJ possessed over 200 million photographs showing the faces of approximately 90 million foreign nationals as of 2018, meaning that over half of them were used for learning.

With more than two-thirds of the freebie images being of foreigners, perhaps the South Korean government thought it would lower its incoming litigation footprint. But that still leaves roughly 50 million images of its own citizens. And there’s nothing preventing foreign citizens from suing the South Korean government, even though doing so can be considerably more expensive than suing locally.

Lawsuits are coming, though, according to Motherboard.

Shortly after the discovery, civil liberty groups announced plans to represent both foreign and domestic victims in a lawsuit.

The legal basis for the collection isn’t being challenged. It’s the distribution of the collected images, which no travelers expressly agreed to. Precedent isn’t on the government’s side.

“Internationally, it is difficult to find any precedent of actual immigration data from domestic and international travelers being provided to companies and used for AI development without any notification or consent,” said Chang Yeo-Kyung, executive director of the Institute for Digital Rights.

It’s pretty sad when democratic governments decide the people belong to the government, rather than the other way around. But as the march towards always-on surveillance continues in travel hubs and major cities, using members of the public as guinea pigs for AI development is probably going to become just as routine as the numerous, formerly-novel, impositions placed on travelers shortly after the 9/11 attacks.
