Company That Turned ‘Excited Delirium’ Into A Thing Thinks It Can Prevent School Shootings With Drone-Mounted Tasers [UPDATED]
from the maybe-just-shut-the-fuck-up dept
UPDATE: Since this post was written over the weekend, there has been a notable development. Axon has, for the moment, pulled the ends of its toes back from the edge of the precipice. It only took the resignation of most of its Ethics Board (nine of twelve members) to force the company to reconsider its move towards offering schools access to armed drones.
But this statement from Axon’s CEO seems to imply the company is still willing to crane its neck to peer over the precipice until it’s socially feasible to go over the edge of it.
Axon’s founder and CEO Rick Smith said the company’s announcement last week — which drew a rebuke from its artificial intelligence ethics board — was intended to “initiate a conversation on this as a potential solution.” Smith said the ensuing discussion “provided us with a deeper appreciation of the complex and important considerations” around the issue.
As a result, “we are pausing work on this project and refocusing to further engage with key constituencies to fully explore the best path forward,” he said.
“Pausing” is not “scrapping.” This suggests the company is still exploring the option, but isn’t willing to fight the current membership of its Ethics Board over it. If the nine resigned members un-resign, the pause should be indefinite. If Axon instead decides to replace those members with ones more willing to let Axon do what it wants, the project could be un-paused in the near future.
[Original post follows below.]
Axon — the company that crafted a pseudo-scientific form of plausible deniability for cops who’ve killed people — now wants to modify the ever-popular (and patently ridiculous) maxim “The only person who can stop a bad guy with a gun is a good guy with a gun.”
There were plenty of “good” guys with guns present at the last major school shooting. They did nothing to stop the killing. Instead, they huddled a safe distance away until another law enforcement agency showed up to actually stop the school shooter.
If law enforcement can’t handle mass shootings quickly and competently (and agencies have given us little indication that they can), they certainly shouldn’t be entrusted with an airborne weapon now being pushed by a company that sees shootings like the one in Uvalde, Texas, as another way to bump year-over-year sales increases.
Here’s the EFF’s take on this announcement of Axon’s armed drone proposal:
Taser and surveillance vendor Axon has proposed what it claims to be the solution to the epidemic of school shootings in the United States: a remote-controlled flying drone armed with a taser. For many, many reasons, this is a dangerous idea. Armed drones would mission-creep their way into more every-day policing. We must oppose a process of normalizing the arming of drones and robots.
Police currently deploy many different kinds of moving and task-performing technologies. These include flying drones, remote control bomb-defusing robots, and autonomous patrol robots. While these different devices serve different functions and operate differently, none of them—absolutely none—should be armed with any kind of weapon.
Here’s Axon’s far more cheery take on the addition of Taser devices to drones:
Put together, these two technologies may effectively combat mass shootings. In brief, non-lethal drones can be installed in schools and other venues and play the same role that sprinklers and other fire suppression tools do for firefighters: Preventing a catastrophic event, or at least mitigating its worst effects.
The press release and Axon’s blog post say plenty of nice things about clean, ethical disarming of mass shooters. But neither acknowledges the statement made by Axon’s Ethics Board, which is firmly opposed to Axon moving forward with this plan.
Having… deliberated at length, a majority of the Ethics Board last month ultimately voted against Axon moving forward, even on those limited terms.
Axon’s decision to announce publicly that it is proceeding with developing TASER-equipped drones and robots to be embedded in schools, and operated by someone other than police, gives us considerable pause. It is a notable expansion of what the Board discussed at length. Axon’s announcement came before the company even began to find workable solutions to address many of the Board’s already-stated concerns about the far more limited pilot we considered, and before any opportunity to consider the impact this technology will have on communities. Now, Axon has announced it would not limit the technology to policing agencies, but would make it more widely available. And the surveillance aspect of this proposal is wholly new to us. Reasonable minds can differ on the merits of police-controlled TASER-equipped drones — our own Board disagreed internally — but we unanimously are concerned with the process Axon has employed regarding this idea of drones in school classrooms.
So, why even bother having an Ethics Board? If its contribution ultimately means nothing, Axon is free to do whatever it wants while still pretending it has a conscience. And if Axon can override this decision, it’s equally free to walk back its earlier Ethics Board-guided decision to keep facial recognition tech off its body cameras.
The downsides of this program are immense. The Ethics Board knows this. Axon knows it too. It just doesn’t appear to care. The program turns kids into surveillance subjects and operates on the wholly unproven premise that a drone with a Taser is capable of incapacitating a mass shooter. And while the rollout is still months or years away, Axon appears to believe it’s actually capable of providing a solution to a problem the United States has steadfastly refused to solve — one wholly reliant on its products.