Palantir's Law Enforcement Data Stranglehold Isn't Good For Police Or The Policed

from the all-the-problems,-none-of-the-accountability dept

Palantir has made government surveillance big business. It’s a multi-billion-dollar company built mainly on government contracts. Its tech prowess and computing power have made it the go-to company for data harvesting, and many of its most loyal customers are local law enforcement agencies.

Mark Harris of Wired has put together a fascinating exposé of the company’s work with US law enforcement, based on documents obtained via FOIA requests. What’s uncovered does little to alter Palantir’s reputation as an enemy of personal privacy. What’s added to this rep isn’t any more flattering: the documents show Palantir handles data carelessly, locks customers into overpriced support and upgrades, and otherwise acts as though it answers to no one.

In one case, files marked as sensitive by a Long Beach drug squad detective were still accessible to other officers who shouldn’t have had access. Multiple emails to Palantir failed to resolve the issue. Making it worse was the fact the problem couldn’t be contained in-house. When agencies sign up for Palantir services, they’re given heavily-discounted rates if they allow their data to be shared with other law enforcement agencies. Detectives hoping to shield sensitive sources and undercover cops from outside access were finding out their employers had signed that protection away in exchange for cheaper initial pricing.

That’s just the beginning of Palantir problems uncovered by these public document requests:

In the documents our requests produced, police departments have also accused the company, backed by tech investor and Trump supporter Peter Thiel, of spiraling prices, hard-to-use software, opaque terms of service, and “failure to deliver products” (in the words of one email from the Long Beach police). Palantir might streamline some criminal investigations—but there’s a possibility that it comes at a high cost, for both the police forces themselves and the communities they serve.

These documents show how Palantir applies Silicon Valley’s playbook to domestic law enforcement. New users are welcomed with discounted hardware and federal grants, sharing their own data in return for access to others’. When enough jurisdictions join Palantir’s interconnected web of police departments, government agencies, and databases, the resulting data trove resembles a pay-to-access social network—a Facebook of crime that’s both invisible and largely unaccountable to the citizens whose behavior it tracks.

Palantir encourages the use of predictive policing. By analyzing data from past incidents and arrests, agencies are supposed to be able to identify “hot spots” where criminal activity is likely to occur and step up patrols in those areas. There are several problems with this approach, not the least of which is the latent encouragement of profiling by officers patrolling these areas, who are likely to view everyone they approach as a criminal suspect, rather than someone who just lives or works in a software-generated “hot spot.”

But the problems go deeper than that with Palantir involved. Predictive policing is data-driven, but it is also a victim of circular logic. If predictive policing doesn’t appear to be having much effect, the usual solution is to feed it more data. Palantir’s predictive algorithms are particularly data-hungry. Officers patrolling hot spots are required to fill out detailed encounter reports, recording everything they can about the person spoken to, as well as anything else observed in the area. All of this is fed back into Palantir’s predictive policing software.
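The circular logic is easy to demonstrate with a toy simulation. This is not Palantir’s actual algorithm; every number and the detection model below are invented for illustration. Two areas have identical underlying incident rates, patrols are allocated in proportion to past recorded incidents, and patrols in turn determine what gets recorded. The initial skew in the records never corrects itself:

```python
# Toy model of the predictive-policing feedback loop (hypothetical numbers;
# not Palantir's algorithm). Two areas, identical true incident rates.

TRUE_RATE = 10.0             # actual incidents per area per period (equal!)
DETECTION_PER_PATROL = 0.1   # fraction of incidents recorded per patrol unit
TOTAL_PATROLS = 10

def step(recorded):
    """Allocate patrols in proportion to past recorded incidents,
    then record new incidents in proportion to patrol presence."""
    total = sum(recorded)
    patrols = [TOTAL_PATROLS * r / total for r in recorded]
    return [TRUE_RATE * min(1.0, DETECTION_PER_PATROL * p) for p in patrols]

# Start with a small recording imbalance between two identical areas.
recorded = [6.0, 4.0]
for _ in range(10):
    recorded = step(recorded)

share_a = recorded[0] / sum(recorded)
print(f"Area A's share of recorded crime after 10 periods: {share_a:.2f}")
```

Even after many iterations, Area A’s share of recorded crime stays at 60 percent, not the 50 percent the equal underlying rates would justify. The loop doesn’t discover reality; it preserves whatever bias was already in the data, and adding more data of the same kind only entrenches it.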

At this point, the gathering of data has become so streamlined that law enforcement agencies have begun allowing Palantir to swallow up other law enforcement databases (namely CLETS, the California Law Enforcement Telecommunications System) and crunch that data into something actionable. Sure enough, Palantir’s software has coughed up… something. But tips this bad should come from unvetted informants and questionable eyewitnesses, not multimillion-dollar programs.

In February 2013, JRIC was tasked with tracking down Christopher Dorner, an ex-LAPD officer who had embarked on a series of shootings targeting law enforcement officers. The effort involved dozens of agencies across the state. “We used Palantir extensively to address that [and] were active 24/7 until he was caught or killed,” remembers Jackson. “We found that processing clues was a big challenge.”

In fact, on two separate occasions, police shot at trucks misidentified as belonging to Dorner, injuring three civilians.

A larger problem, at least in terms of personal privacy, is the potential for abuse. Smaller data silos meant unauthorized use/access of law enforcement databases could at least be somewhat mitigated by the limitations of the database itself. Now, with multiple agencies tied together through Palantir’s data sharing (along with its swallowing of existing law enforcement databases), those wishing to abuse their access have a much larger dataset to dig through.

In the end, someone has to pay for all this data. And, man, will they ever. Obtained documents and interviews with officials show Palantir seduces law enforcement with low introductory prices before ratcheting up fees once customers have nowhere else to go.

According to LA County contracts, when JRIC committed to the full Palantir system in October 2011, the LASD paid around $122,000 each for 20 Palantir “cores”: packages of already-configured computer servers bundled with preinstalled software. That price was approximately $19,000 less per core than Palantir charged the federal government. According to paperwork for the pilot program, LASD received a “special discount because it [would] be the first in the LA basin to use this software.”

Palantir’s customers must rely on software that only the company itself can secure, upgrade, and maintain. Although the letter noted Palantir had not provided JRIC with any of its requested (but unspecified) metrics by spring 2016, the company is set to receive annual maintenance payments of nearly $2.5 million from the fusion center through the spring of 2019.

That’s taxpayer money being fed to a single-source contractor whose end goal is to tie everyone to everyone else using steep discounts predicated on data sharing. And it appears to be drowning in data, with no customer able to point to positive, real-world changes that can be conclusively linked to Palantir’s law enforcement software. But it’s too late to do anything about it. In California, law enforcement agencies bought cheap and surrendered control. It’s likely happening elsewhere in the nation, but the paper trail has yet to be exposed. Citizens, of course, are the ones paying for all of this, not only with their tax dollars but with their individuality, having been reduced to data points in a stream of alleged criminal activity held by a private party that’s probably already imagining secondary markets for its law enforcement data stores.


Comments on “Palantir's Law Enforcement Data Stranglehold Isn't Good For Police Or The Policed”

That Anonymous Coward (profile) says:

Named for the device used to spy on Hobbit Beauty Queens

It’s a pity that a company that offered to generate evidence to order for a price is now treated like the holy grail of evidence & honesty.

We really need to start demanding more accountability from public officials who see a glitzy promo & sign away millions for the right to pay even more.

We’ve seen billions funneled into pipe-dreams because the presentation took place in a room modeled after a star ship, rather than based on actual output.

Exposing CIs and undercover officers is completely unacceptable & rather than e-mailing, someone needed to sue. If they can’t bother to keep data marked sensitive sequestered, how the fuck can we trust them to not mishandle all of the other data they are gobbling up for an unproven system that will predict all the crime.

Rashad Ghaffar (user link) says:

Police transparency & accountability

The best way to hold police accountable is to do it ourselves. I’ve started a company and published an app called Shield Report. It’s a free app that allows you to rate an officer’s job performance after an interaction. The data from the completed survey, combined with user demographic info, is stored in the app’s reporting feature. There you can create and customize your own reports on a specific officer, precinct, or an entire state. Check it out at the App Store and Google Play by searching shieldreport, or go to for more information.
