Documents Show IBM Pitched The NYPD Facial Recognition Software With Built-In Racial Profiling Options

from the facial-profiling-amirite dept

Documents obtained by The Intercept show the NYPD and IBM engaged in a long-running facial recognition tech partnership from 2008 to 2016. While some of this deployment was discussed publicly, details about the extent of the program — as well as its more problematic elements — haven’t been.

As the article’s title informs the reader, camera footage could be scanned for face matches using skin tone as a search constraint. Considering this was pushed by IBM as a tool to prevent the next 9/11, it’s easy to see why the NYPD — given its history of surveilling Muslim New Yorkers — might be willing to utilize a tool like this to pare down lists of suspects to just the people it suspected all along (Muslims).

There are a number of surprises contained in the long, detailed article, but the first thing that jumps out is IBM’s efforts and statements, rather than the NYPD’s. We all know the government capitalizes on tragedies to expand its power, but here we see a private corporation appealing to this base nature to make a sale.

In New York, the terrorist threat “was an easy selling point,” recalled Jonathan Connell, an IBM researcher who worked on the initial NYPD video analytics installation. “You say, ‘Look what the terrorists did before, they could come back, so you give us some money and we’ll put a camera there.’”

From this pitch sprang an 8-year program — one deployed in secrecy by the NYPD to gather as much footage of New Yorkers as possible for dual purposes: meeting its own law enforcement needs and serving as a testing ground for IBM’s new facial recognition tech. Needless to say, New Yorkers were never made aware of their lab rat status in IBM’s software development process.

Even though the software could search by skin tone (as well as by “head color,” age, gender, and facial hair), the NYPD claims it never used that feature in a live environment, despite IBM’s urging.

According to the NYPD, counterterrorism personnel accessed IBM’s bodily search feature capabilities only for evaluation purposes, and they were accessible only to a handful of counterterrorism personnel. “While tools that featured either racial or skin tone search capabilities were offered to the NYPD, they were explicitly declined by the NYPD,” Donald, the NYPD spokesperson, said. “Where such tools came with a test version of the product, the testers were instructed only to test other features (clothing, eyeglasses, etc.), but not to test or use the skin tone feature. That is not because there would have been anything illegal or even improper about testing or using these tools to search in the area of a crime for an image of a suspect that matched a description given by a victim or a witness. It was specifically to avoid even the suggestion or appearance of any kind of technological racial profiling.”

It’s easy to disbelieve this statement by the NYPD, given its long history of racial profiling, but it may be that those handling the secret deployment understood no program remains secret forever and sought to head off complaints and lawsuits by discouraging use of a controversial search feature. It may also be that the NYPD was super-sensitive to these concerns following the partial dismantling of its stop-and-frisk program and the outing of its full-fledged, unconstitutional surveillance of local Muslims.

The thing is, IBM is still selling the tech it beta tested live on New Yorkers. The same features the NYPD rejected are being used to sell other law enforcement agencies on the power of its biometric profiling software.

In 2017, IBM released Intelligent Video Analytics 2.0, a product with a body camera surveillance capability that allows users to detect people captured on camera by “ethnicity” tags, such as “Asian,” “Black,” and “White.”

And there’s a counter-narrative that seems to dispute the NYPD’s assertions about controversial image tagging features. The IBM researcher who helped develop the skin tone recognition feature is on record stating the company doesn’t develop features unless there’s a market for them. In his estimation, the NYPD approached IBM to ask for this feature while the 8-year pilot program was still underway. The NYPD may have opted out after the feature went live, but it may have only done so to steer clear of future controversy. An ulterior motive doesn’t make it the wrong move, but it also shouldn’t be assumed the NYPD has morphed into a heroic defender of civil liberties and personal privacy.

For other law enforcement agencies not similarly concerned about future PR black eyes, “mass racial profiling” is now available at their fingertips. IBM has built a product that appeals to law enforcement’s innate desire to automate police work, replacing officers on the street with cameras and software. Sure, there will be some cameras on patrol officers as well, but those are just for show. The real work of policing is done at desks using third-party software that explicitly allows — if not encourages — officers to narrow down suspect lists based on race. In a country so overly concerned about terrorism, this is going to lead to a lot of people being approached by law enforcement simply because of their ethnicity.

An additional problem with IBM’s software — and with competing products — is that many of the markers used to identify potential suspects can easily net a long list of probables who share nothing but similar body sizes or clothing preferences. Understandably, more work is done by investigators manning these systems before cops start rounding people up, but the potential for inadvertent misuse (never mind actual misuse) is still incredibly high.

The secrecy of these programs is also an issue. Restrictive NDAs go hand-in-hand with private sector partnerships and these are often translated by police officials to mean information must be withheld from judges, criminal defendants, and department oversight. When that happens, due process violations gather atop the privacy violation wreckage until the whole thing collapses under its own audacity. Nothing stays secret forever, but entities like the NYPD and IBM could do themselves a bunch of favors by engaging in a little proactive transparency.

Companies: ibm


Comments on “Documents Show IBM Pitched The NYPD Facial Recognition Software With Built-In Racial Profiling Options”

That Anonymous Coward (profile) says:

“It was specifically to avoid even the suggestion or appearance of any kind of technological racial profiling.”

Did a cooler head prevail by pointing out the sheer number of times y’all were racially profiling people with no evidence beyond skin tone & religious beliefs?

Did the huge piles of articles showing that other countries really don’t want your ‘experts’ on the scene of terrorism events (b/c they immediately try to blame the brown guy) have anything to do with it?

Perhaps we would all be better off if we stopped having ‘secret’ programs that we hide from the view of the courts.
If you aren’t willing to bring it to court, why are you using it?
Is it because the court would tear your case apart??

This whole ‘we can’t let the terrorists know how our programs work’ cover story is wearing really thin.
If you look, a majority of these programs are, at their heart, ‘the darker your skin, the fewer rights you have.’

Perhaps if we stopped the cops from treating those of darker skin tones so poorly, those communities would trust them more and speak out when they see something wrong. As it stands now, we’ve seen them sneak a CI into a mosque who so alarmed the members that they called the FBI about him, and the FBI left the CI in place a while longer. Can you imagine how far over the top the CI needed to be that Muslims would reach out to the FBI (who used those ‘all Muslims are evil’ training manuals until it was reported on) for help, knowing that they themselves would most likely be on the receiving end of a rights-trampling colonoscopy?

JoeCool (profile) says:

Riiiiiight

“Where such tools came with a test version of the product, the testers were instructed only to test other features (clothing, eyeglasses, etc.), but not to test or use the skin tone feature. That is not because there would have been anything illegal or even improper about testing or using these tools to search in the area of a crime for an image of a suspect that matched a description given by a victim or a witness. It was specifically to avoid even the suggestion or appearance of any kind of technological racial profiling.”

Because we all know how good the NYPD is at following directions. 😉

ShadowNinja (profile) says:

Racial profiling is worthless against so few terrorists

There’s a certain paradox law enforcement should be aware of, one that explains why this racial profiling software is bound to fail miserably in practice.

Imagine there’s a deadly virus that has only infected a tiny percentage of the population (a fraction of a percent). If left untreated, it could spread to a ton of other people and cause massive harm. But let’s say we have a test that’s 99% accurate at diagnosing who has this deadly disease.

You might think with those numbers that we could safely find everyone with the disease if everyone is willing to get screened for it. But that’s not true at all. Because such a tiny percentage of the population actually has the deadly disease, even a 99% accurate test would be worthless: it would flag far more people as having the disease than actually have it. Even testing people a second time to verify the original results wouldn’t fix this; with such a tiny percentage of the population actually infected, you’d still flag far more people who don’t have it than who do.
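To make the math concrete, here’s a minimal back-of-the-envelope sketch in Python. All of the numbers are hypothetical: a 300-million-person population, a disease infecting 0.01% of it, and a test that’s 99% accurate in both directions:

```python
# Base-rate arithmetic with hypothetical numbers: a 300-million-person
# population, 0.01% of it infected, and a screening test that is 99%
# accurate in both directions (99% sensitivity, 99% specificity).

population = 300_000_000
prevalence = 0.0001          # assumed: 0.01% actually infected
accuracy = 0.99              # assumed: 99% sensitivity and specificity

infected = population * prevalence           # 30,000 people
healthy = population - infected              # 299,970,000 people

true_positives = infected * accuracy         # sick and correctly flagged
false_positives = healthy * (1 - accuracy)   # healthy but wrongly flagged

share_real = true_positives / (true_positives + false_positives)

print(f"Correctly flagged: {true_positives:,.0f}")   # 29,700
print(f"Wrongly flagged:   {false_positives:,.0f}")  # 2,999,700
print(f"Chance a flagged person is actually sick: {share_real:.1%}")  # ~1.0%
```

With these made-up numbers, roughly 99 out of every 100 people the test flags are false alarms, which is the paradox in a nutshell.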

This is essentially the situation with stopping terrorists. Law enforcement loves to lean on the crutch of racial profiling to find terrorists, but because there are so few of them and so many non-terrorists (over 300 million people in the US alone), racial profiling becomes worthless: it will virtually always find non-terrorists instead.

Bamboo Harvester (profile) says:

Re: Racial profiling is worthless against so few terrorists

Well said, but your analogy isn’t quite right.

Would you “racial profile” your medical screening if the Killer Disease was Sickle Cell Anemia?

I’ve seen people actually rant about how some DRUGS are racist because MAO’s are “specified for Black Men!!!” over other BP drugs.

I think the outrage over this is overboard. The assumption is that crowds are scanned looking ONLY for “potential threats”.

Nobody wants to sit in front of monitors day in day out watching random crowds – video review is ALWAYS targeted.

If a store is robbed and the criminals are caught on video, skin (or head) color is a MAJOR factor in recognition software, mainly because it usually removes 99% of the potentials.

I don’t know how fine the “color match” can be tuned, but I suspect it’s quite fine.
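To put rough numbers on that, here’s a toy sketch of how stacked attribute filters shrink a candidate pool. The pool size and per-filter pass rates are invented for illustration; they aren’t taken from IBM’s actual product or any real deployment:

```python
# Toy illustration of stacked attribute filters narrowing a candidate
# pool. The pool size and per-filter pass rates below are invented for
# illustration; they are not from IBM's product or any real deployment.

candidates = 100_000.0  # hypothetical number of people caught on camera

# Fraction of remaining candidates each (assumed independent) filter keeps.
filters = {
    "matches skin/head color": 0.30,
    "matches gender":          0.50,
    "matches age range":       0.20,
    "matches clothing":        0.10,
}

for description, pass_rate in filters.items():
    candidates *= pass_rate
    print(f"{description:<25} -> {candidates:>8,.0f} candidates left")
```

Even individually coarse filters, stacked together, cut a made-up pool of 100,000 down to a few hundred, which is why investigators lean on every attribute the software can tag.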

So, what’s next, cops will be banned from posting over the radio that their “suspect is a black male 25-30 wearing a gray hoodie”?

After all, that’s racist, sexist, and ageist profiling. Not to mention the fashion police….

OldMugwump (profile) says:

Nothing wrong with using skin color as a search qualifier

…if you’re searching for a specific suspect.

If you have a report that the suspect in question is 7’2" tall, with blue hair, purple eyes, and green skin, then that’s what you look for. That’s not racial profiling.

If you start hassling random green-skinned people for no reason, that’s racial profiling.

SirWired (profile) says:

It's a perfectly normal search constraint.

“Skin Color” is a perfectly normal thing to search for when trying to find a suspect for a particular crime; it’s no more problematic than gender, dress, facial hair, etc.

Of course it should not be used for “predictive” policing (e.g. “The more brown people, the more officers”), but if looking for a single person, it’s perfectly reasonable.

On another note, the whole bit about IBM and the Holocaust was completely overblown, and it wasn’t any kind of secret either. IBM did work on the German census, and religion was one of the questions on said census, but that is a perfectly normal census question. While it’s not on the US census, it most certainly is on census questionnaires in many countries even today, including such despotic regimes as Australia and the UK. (This usually comes up when some smart-ass tries to get ‘Jedi’ included in the official statistics.)

Anonymous Coward says:

Re: It's a perfectly normal search constraint.

It is now acceptable, iirc, down under to claim Jedi Knight as your official religion.

One has to wonder what it is about such questions that makes them important, and to whom. Why does anyone in government need such information, and to what purpose is it used?

Is something considered normal just because other people are doing it?

Zgaidin (profile) says:

Re: Re: It's a perfectly normal search constraint.

Like all information, demographic information about things like religion, sexual orientation, ethnicity, etc. can be used for good or ill. In our rightly suspicious view of governments, it’s hard to see how it can possibly be good, but considering census data is public, it can actually be a powerful tool against governments. For example, how do you point out that a wildly disproportionate percentage of prison inmates in the US are black males if you don’t know what percentage of the total population they constitute for comparative purposes?

Zgaidin (profile) says:

It Depends... ?

Yes, NYPD has a bad history of racial profiling and discrimination and they should be viewed with suspicion on those grounds. However, as others have pointed out, it depends entirely on how this software was being used.

If they were using it to identify “hot spots” of certain ethnic concentrations to inform officer deployment, yeah that’s bad. You can’t do that.

If they were retaining the footage so they could go back when searching for a specific subject, trying to find historical clues as to his or her whereabouts, that’s bad too – at least in my mind but for reasons unrelated to race. At that point, race is just a filter on the database query to speed the process.

If they’re using it on live footage to look for a specific suspect based on physical description, that just makes sense. If, however, they are NOT doing that “specifically to avoid even the suggestion or appearance of any kind of technological racial profiling” then that’s also bad, because it’s stupid. If you know your suspect is a “short, white male, 35-45, light brown hair,” don’t waste time having the system look at where all the tall, brown, black, quite young, quite old, or red-headed people are. That’s pointless.
