LAPD Blames Predictive Software For Misconduct And Abuse, Rather Than Its Own Disinterest In Holding Officers Accountable
from the it's-never-a-cop's-fault dept
As long as we're heading into an age of predictive policing, it's good to know that some police departments are willing to turn the ThoughtCrime scanner on their own employees.
Police departments across the U.S. are using technology to try to identify problem officers before their misbehavior harms innocent people, embarrasses their employer, or invites a costly lawsuit — from citizens or the federal government.

Of course, some of this is just "insider threat" detection that ousts whistleblowers before they can blow the whistle and punishes employees for not adhering to the prevailing mindset. Nothing about this software is anywhere close to perfect, but it's still being used to (hopefully) head off police misconduct before it occurs. But what the system flags doesn't seem to be stopping cops before they do something regrettable.
The systems track factors such as how often officers are involved in shootings, get complaints, use sick days and get into car accidents. When officers hit a specific threshold, they're supposed to be flagged and supervisors notified so appropriate training or counseling can be assigned.

Proponents of the system point out that its largest value is as a deterrent. Even so, it's still relatively worthless.
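The mechanism described above amounts to a simple per-factor threshold check. A minimal sketch of that logic is below; the field names and threshold values are illustrative assumptions, not the actual rules any department uses.

```python
# Hypothetical sketch of a threshold-based early-warning flag. The
# tracked factors come from the excerpt above; the numeric thresholds
# are invented for illustration only.
from dataclasses import dataclass


@dataclass
class OfficerRecord:
    name: str
    shootings: int
    complaints: int
    sick_days: int
    car_accidents: int


# Assumed per-factor thresholds (illustrative, not real policy).
THRESHOLDS = {
    "shootings": 2,
    "complaints": 5,
    "sick_days": 15,
    "car_accidents": 3,
}


def flag_officer(record: OfficerRecord) -> list[str]:
    """Return the factors on which this officer meets or exceeds a threshold."""
    return [
        factor
        for factor, limit in THRESHOLDS.items()
        if getattr(record, factor) >= limit
    ]


# A non-empty result is supposed to trigger a supervisor notification.
officer = OfficerRecord("Officer X", shootings=1, complaints=6,
                        sick_days=20, car_accidents=0)
alerts = flag_officer(officer)
if alerts:
    print(f"ALERT: {officer.name} exceeded thresholds on: {', '.join(alerts)}")
```

Note that everything the software does ends at that notification: as the rest of the article makes clear, whether anyone acts on the alert is entirely up to the department.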
The Los Angeles Police Department agreed to set up their $33 million early warning systems after the so-called Rampart scandal in which an elite anti-gang unit was found to have beaten and framed suspected gang members. The system was then implemented in 2007.

The LAPD presents this as a software failure -- and some of it is. What's being flagged isn't necessarily indicative of potential misconduct. But beyond the algorithm, there's an integral part of the process being ignored.
The LAPD's inspector general found in a recent review that the system was seemingly ineffective in identifying officers who ultimately were fired. The report looked at 748 "alerts" over a four-month period and found the agency took little action in the majority of cases and only required training for 1.3 percent, or 10 alerts, of them.
Experts say the early warning system can be another powerful tool to help officers do their jobs and improve relations, but it is only as good as the people and departments using it… "These systems are designed to give you a forewarning of problems and then you have to do something."

Even the IG's report notes nothing's being done. 748 "alerts" resulted in action on only 10 of them. The LAPD is trying to portray this as a software failure, most likely in hopes of ditching the system that was forced on it by its own bad behavior. (The irony here is that police departments will argue that predictive policing software doesn't work on cops but does work on citizens.)
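The IG's numbers reduce to a one-line rate calculation, using only the figures quoted from the report above:

```python
# Alert-to-action rate from the inspector general's review figures.
total_alerts = 748         # alerts over the four-month review period
alerts_with_training = 10  # alerts where training was actually required

action_rate = alerts_with_training / total_alerts
print(f"{action_rate:.1%}")  # roughly 1.3%, matching the report
```

Put another way: nearly 99 out of every 100 alerts produced no required response at all.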
But it's not just the software. It's the LAPD. Long before the Rampart scandal of the late '90s uncovered massive corruption in the force, the LAPD's Internal Affairs department was doing absolutely nothing to hold officers accountable for misconduct.
The Christopher Commission (1991) in Los Angeles found that the Internal Affairs Division (IAD) of the LAPD had sustained only 2 percent of the excessive force complaints and stated: "Our study indicates that there are significant problems with the initiation, investigation, and classification of complaints." It called the IAD investigations "unfairly skewed against the complainant."

More recent reports [pdf] still show that public complaints are almost never sustained (3.5%). Even factoring in the much higher rate given to complaints from other officials and officers (45%), the overall rate still routinely sits near 10%.
This isn't just Los Angeles. Overall, the nation's law enforcement agencies are only sustaining 8% of complaints. Officers have seemingly unlimited "strikes" before misconduct costs them their jobs. Combine that with the low sustain rate and officers know they can get away with a lot before they receive any discipline.
And if that isn't enough, these flagging systems create their own perverse incentives:
A 2011 Justice Department report found the New Orleans Police Department's system, adopted roughly two decades ago, was "outdated and essentially exists in name only." Investigators said information was included haphazardly and flagged officers were put into essentially "bad boy school," a one-size-fits-all class seen by some as a badge of honor.

No doubt more than a few New Orleans residents received a few extra nightstick swings/Taser shots just so Officer X could hang with the big boys. Fun stuff.
But on the other side of the coin lies the LA Sheriff's Department -- at least in terms of predictive software.
The sheriff's department has an early warning system. "Our diagnostic systems were fine," said the department's Chief of Detectives, Bill McSweeney, who advised his agency on creation of the warning system. "Our managerial and supervision response was not fine. It's that simple."

The LASD is finally acknowledging that it let its officers (and prison guards) act with impunity for far too many years. The system could have worked -- at least in its limited capabilities -- but no one wanted to follow up on flagged officers. The situation there has deteriorated to the point that the LASD is looking at a few years of federal supervision.
Predictive policing is still a bad idea, even for policing police. While data may help pinpoint problem areas, the flagging systems are far too inaccurate to reliably predict misconduct. But the real problem within law enforcement agencies is the lack of accountability, not faulty software. Unless that first problem is addressed, it won't matter how much the software improves in the future.