City Of Chicago, Chicago PD Officers Sued Over Their Use Of Questionable ShotSpotter Technology
from the looking-forward-to-discovery dept
ShotSpotter has made a name for itself (and not a good one!) by telling cops its sensors and mics can convert soundwaves into actionable intel. It has become beloved by some cops, perhaps in part for being willing to alter its reports to reflect what cops want to believe happened, rather than what actually may have happened.
In one case in Rochester, New York, resident Silvon Simmons was shot three times in the back by Rochester PD officer Joseph Ferrigno. The PD approached ShotSpotter and asked it to take another look at its data in hopes of excusing the officer’s otherwise unjustified deadly force. ShotSpotter apparently complied.
A Rochester police officer acknowledged at the criminal trial of Silvon Simmons that he left the shooting scene after midnight, returned to the fourth floor at the Central Investigations Division of the Rochester Police Department, logged onto a computer and opened a chat session with Shotspotter, where he provided the location, time, the number of possible shots and the caliber of the weapons allegedly fired. Officer Robert Wetzel also testified that Shotspotter responded to him that they found a fifth gunshot at his request.
This conclusion was based solely upon information provided to Shotspotter by the Rochester Police Department.
It happened again, this time to the Chicago resident whose experience with ShotSpotter’s shifting “analysis” resulted in his wrongful arrest (and forms the basis for this lawsuit). Here’s how that went down:
On May 31 last year, 25-year-old Safarain Herring was shot in the head and dropped off at St. Bernard Hospital in Chicago by a man named Michael Williams. He died two days later.
Chicago police eventually arrested the 64-year-old Williams and charged him with murder (Williams maintains that Herring was hit in a drive-by shooting). A key piece of evidence in the case is video surveillance footage showing Williams’ car stopped on the 6300 block of South Stony Island Avenue at 11:46 p.m.—the time and location where police say they know Herring was shot.
[A]fter the 11:46 p.m. alert came in, a ShotSpotter analyst manually overrode the algorithms and “reclassified” the sound as a gunshot. Then, months later and after “post-processing,” another ShotSpotter analyst changed the alert’s coordinates to a location on South Stony Island Drive near where Williams’ car was seen on camera.
Based on this revelation, prosecutors decided to withdraw all the ShotSpotter evidence — the sole thing “linking” Williams to the killing. By that point, Williams had already spent nearly a year in jail. (Vice/Motherboard was sued over this reporting by ShotSpotter, but recently a Delaware judge ruled the reporting was “substantially truthful” [while quoting ShotSpotter’s own in-court testimony], en route to tossing the company’s defamation suit.)
A few weeks after this debacle, the Chicago PD's Inspector General delivered a report showing ShotSpotter was nothing more than expensive guesswork that had contributed almost nothing to the PD's efforts to reduce gun crime. In fact, the only thing it seemed to be "helping" was officers fond of performing unconstitutional stops, who used ShotSpotter reports to justify stops and searches simply because shots had been reported in the general area at some point in the past.
While some cities have dumped the dubious tech, the City of Chicago appears to be in no hurry to do so. Now, the city is getting sued (along with a long list of Chicago PD officers) by public interest groups who want to see public entities held accountable for using questionable tech to destroy lives. Here’s Mack DeGeurin, summarizing the case for Gizmodo.
A federal lawsuit filed against the city of Chicago is calling into question law enforcement’s use of controversial gunshot detection technology for gathering key pieces of evidence. The suit, filed this week by the MacArthur Justice Center at Northwestern University’s law school, alleged Chicago police misused inaccurate and unreliable tech from the firm ShotSpotter and didn’t pursue additional leads. The suit accuses police of putting “blind faith” in ShotSpotter’s supposed evidence.
The lawsuit, first reported by The Associated Press, revolves around a 2020 shooting in Chicago that left one man dead. Police linked that alleged killing to a 65-year-old man named Michael Williams using two pieces of evidence. Police reportedly obtained a noiseless security video of Williams’ vehicle passing through an intersection which was then linked to a gunshot supposedly detected by ShotSpotter’s system. Williams was arrested in 2021 and spent nearly a year in jail before a judge finally dismissed his case after prosecutors admitted they lacked sufficient evidence.
The lawsuit filed by the MacArthur Justice Center at Northwestern University’s law school seeks damages from the city for mental anguish, loss of income and legal bills for the 65-year-old Williams, who said he still suffers from a tremor in his hand that developed while he was locked up. It also details the case of a second plaintiff, Daniel Ortiz, a 36-year-old father who the lawsuit alleges was arbitrarily arrested and jailed by police who were responding to a ShotSpotter alert.
The suit seeks class-action status for any Chicago resident who was stopped on the basis of the alerts. And among other things, it seeks a court order barring the technology’s use in Chicago, the nation’s third-largest city.
The suit details ShotSpotter’s apparent uselessness, even according to CPD officers who use it to engage in biased policing.
Every day in Chicago, Chicago Police Department (CPD) officers are deployed around 100 times to chase down alerts of supposed gunfire generated by ShotSpotter. More than 90% of the time they find no indication of any gun-related incident, according to the City’s own data. The result is more than 31,600 unfounded CPD deployments every year because of ShotSpotter—more than 87 on an average day.
Some cops may complain their time has been wasted. For many other officers, time you enjoy wasting isn’t wasted time.
CPD has intentionally misused ShotSpotter alerts to make scores of illegal stops and arrests. CPD officers, chasing down unfounded ShotSpotter alerts, have stopped and detained thousands of innocent Chicagoans who happened to be near the location of an alert. CPD officers have used ShotSpotter’s presence on the South and West sides of the City as justification for aggressive police tactics—treating residents as suspects, detaining them, and frisking them just because there has supposedly been a history of ShotSpotter alerts in the area. CPD officers have even used ShotSpotter alerts as a basis to falsely accuse people of crimes.
As the plaintiffs argue, the City of Chicago is just as culpable as the sued police officers. It has continued to pay for ShotSpotter's services on the theory that it's a public safety essential, yet it won't stand behind ShotSpotter evidence in court: prosecutors dump cases whenever defendants demand information about ShotSpotter's accuracy and reliability.
The Municipal Defendants know, based on public testimony in City Council and the experience of their own employees, that Cook County prosecutors drop cases involving ShotSpotter evidence and avoid defending the system’s reliability in court when it is challenged in criminal cases. The Municipal Defendants and other city managers have been repeatedly confronted in public, in City Council, and in other forums with the many harms that the system imposes on the predominantly Black and Latinx people who live in neighborhoods under ShotSpotter’s surveillance.
In spite of this knowledge, the Municipal Defendants continue to fund and use ShotSpotter to dispatch CPD officers on tens of thousands of unwarranted deployments every year. They continue to direct CPD officers to rely on ShotSpotter alerts in stopping and detaining civilians, without cause.
ShotSpotter is Chicago's useful idiot: good enough to justify arrests but too unreliable to be counted on in court. That undeniable inconsistency isn't going to help the city in this lawsuit. Neither are the hit rates (the share of alerts that lead officers to file criminal incident reports) reported by other cities using the same tech, which range from a pathetic 5% (Dayton, Ohio) to an abysmal 0.5% (Minneapolis, Minnesota).
ShotSpotter has rigged the game in its favor, which means city residents lose while the company can continue to make unfounded accuracy claims while collecting a steady paycheck.
The City’s contract with ShotSpotter only requires that for every 10 alerts that ShotSpotter sends out, ShotSpotter can receive no more than one complaint from CPD about a gunshot that CPD discovered but which ShotSpotter missed or mislocated by more than 25 meters. Only these missed or mislocated gunshots count against ShotSpotter. False positive alerts and situations where CPD pursues a ShotSpotter alert and finds nothing to corroborate gunfire never count against ShotSpotter under the contract.
ShotSpotter can thus guarantee that it will always hit its performance target of one complaint per ten alerts simply by putting a thumb on the scale in favor of triggering alerts to loud noises that are not gunfire. Excess but unfounded alerts help ShotSpotter meet its performance target because they overwhelm any complaints ShotSpotter receives from CPD about missed or mislocated gunshots, and they never count against the company.
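The incentive the complaint describes can be made concrete with a toy calculation. This is a sketch; the function and numbers below are invented for illustration — only the one-complaint-per-ten-alerts ratio and the rule that false positives never count come from the contract as the lawsuit describes it:

```python
def meets_contract_target(total_alerts, missed_or_mislocated):
    """Per the lawsuit's description of the contract, only gunshots that
    ShotSpotter missed entirely or mislocated by more than 25 meters count
    against the company; false positives never do. The target: no more
    than one such complaint per ten alerts sent."""
    return missed_or_mislocated <= total_alerts / 10

# Suppose CPD files 40 missed/mislocated complaints in some period.
# With 300 alerts sent, ShotSpotter misses its target (40 > 30)...
print(meets_contract_target(300, 40))  # False

# ...but sending 200 additional alerts -- even if every one is a false
# positive that wastes a police deployment -- restores compliance,
# because extra alerts raise the allowance without ever counting
# against the company (40 <= 50).
print(meets_contract_target(500, 40))  # True
```

In other words, under this metric the cheapest way to stay in compliance is to over-alert, which is exactly the thumb on the scale the complaint alleges.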
This works for the city, as well, which explains its continued reliance on questionable tech. It looks better than doing nothing about gun violence and gives the city (and ShotSpotter) a handy detection statistic to quote while providing enough plausible deniability to ignore the system’s inability to do anything about violent crime rates.
Underneath it all is a system of lab coats, clipboards, and complex equations scrawled on blackboards. It looks like science but it has very little to do with actual science.
According to the technical paper, some of the tiles in the image mosaic encode information that appears to have nothing to do with the actual sound that ShotSpotter’s microphones pick up. For example, one tile visually depicts the “location of recent nearby incidents.” Another tile visually depicts the number of “recent incidents.” That is, the mosaic includes information about prior noises rather than the specific noise that is being analyzed. By including such extraneous information in the image mosaic, ShotSpotter’s algorithms may be more likely to classify noises as gunshots if there have been more “nearby” or “recent” ShotSpotter alerts in the area. In this way, the system may be more likely to trigger false alerts in places where it has previously triggered alerts, creating harmful and inaccurate feedback loops that falsely inflate the number of ShotSpotter alerts in particular areas.
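The feedback dynamic the complaint describes can be sketched as a toy simulation. To be clear: the weight, threshold, and scores below are hypothetical, invented purely to illustrate the mechanism — they are not ShotSpotter's actual model or parameters:

```python
# Toy illustration of the alleged feedback loop: if a classifier's score
# includes a feature counting recent nearby alerts, each alert makes the
# next ambiguous noise in the same area more likely to be classified as
# a gunshot. All numbers here are invented for illustration.

def classify(acoustic_score, recent_alerts_nearby,
             prior_weight=0.1, threshold=0.5):
    """Flag a noise as a gunshot if its acoustic score, boosted by a
    feature counting recent nearby alerts, crosses the threshold."""
    return acoustic_score + prior_weight * recent_alerts_nearby >= threshold

def simulate(initial_recent_alerts, rounds=3, acoustic_score=0.45):
    """Feed the same ambiguous noise (borderline acoustic score) through
    the classifier repeatedly; each alert raises the prior for the next."""
    recent = initial_recent_alerts
    results = []
    for _ in range(rounds):
        fired = classify(acoustic_score, recent)
        results.append(fired)
        if fired:
            recent += 1  # the alert itself feeds back into the feature
    return results

# In an area with no alert history, the ambiguous noise never triggers:
print(simulate(0))  # [False, False, False]

# With a single prior alert on record, the identical noise triggers --
# and keeps triggering, since each alert inflates the prior for the next:
print(simulate(1))  # [True, True, True]
```

The same sound produces opposite classifications depending only on the neighborhood's alert history, which is precisely the self-reinforcing loop the lawsuit says falsely inflates alert counts in particular areas.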
Discovery should be fun if this lawsuit survives long enough to compel production from ShotSpotter. I'm sure the city and its gunshot phrenology provider would prefer that no internal information (from either party) be made public and will try to buy some silence. But we can always hope. And we can definitely hope this is the first of several similar lawsuits that will force cities to drop ShotSpotter, or force ShotSpotter to be more honest about the limitations of its very questionable offering.