Axon Wants Its Body Cameras To Start Writing Officers’ Reports For Them

from the do-we-really-want-cops-with-more-free-time? dept

Taser long ago locked down the market for “less than lethal” (but still frequently lethal) weapons. It has also written itself into the annals of pseudoscience by invoking “excited delirium,” a condition not recognized by mainstream medicine, as it tried to explain away the many deaths caused by its “less than lethal” Taser.

These days Taser does business as Axon. In addition to separating itself from its troubled (and somewhat mythical) past, Axon’s focus has shifted to body cameras and data storage. The cameras are the printer and the data storage is the ink. The real money is in data management, and that appears to be where Axon is headed next. And, of course, like pretty much everyone at this point, the company believes AI can take a lot of the work out of police work. Here’s Thomas Brewster and Richard Nieva with the details for Forbes.

On Tuesday, Axon, the $22 billion police contractor best known for manufacturing the Taser electric weapon, launched a new tool called Draft One that it says can transcribe audio from body cameras and automatically turn it into a police report. Cops can then review the document to ensure accuracy, Axon CEO Rick Smith told Forbes. Axon claims one early tester of the tool, Fort Collins Colorado Police Department, has seen an 82% decrease in time spent writing reports. “If an officer spends half their day reporting, and we can cut that in half, we have an opportunity to potentially free up 25% of an officer’s time to be back out policing,” Smith said.

If you don’t spend too much time thinking about it, it sounds like a good idea. Doing paperwork consumes a large amount of officers’ time, and a tool that automates at least part of the process would, theoretically, allow officers to spend more time doing stuff that actually matters, like trying to make a dent in violent crime — the sort of thing cops on TV are always doing but is a comparative rarity in real life.

It’s well-documented that officers spend a large part of their day performing the less-than-glamorous function of being an all-purpose response to a variety of issues entirely unrelated to the type of crimes that make headlines and fodder for tough-on-crime politicians.

On the other hand, when officers are given discretion to handle crime-fighting in a way they best see fit, they almost always do the same thing: perform a bunch of pretextual stops in hopes of lucking into something more criminal than the minor violation that triggered the stop. A 2022 study of law enforcement time use by California agencies provided these depressing results:

Overall, sheriff patrol officers spend significantly more time on officer-initiated stops – “proactive policing” in law enforcement parlance – than they do responding to community members’ calls for help, according to the report. Research has shown that the practice is a fundamentally ineffective public safety strategy, the report pointed out.

In 2019, 88% of the time L.A. County sheriff’s officers spent on stops was for officer-initiated stops rather than in response to calls. The overwhelming majority of that time – 79% – was spent on traffic violations. By contrast, just 11% of those hours was spent on stops based on reasonable suspicion of a crime.

In Riverside, about 83% of deputies’ time spent on officer-initiated stops went toward traffic violations, and just 7% on stops based on reasonable suspicion.

So, the first uncomfortable question automated report writing poses is this: what are cops actually going to do with all this free time? If it’s just more of this, we really don’t need it. All AI will do is allow problematic agencies and officers to engage in more of the biased policing they already engage in. Getting more of this isn’t going to make American policing better and it’s certainly not going to address the plethora of long-standing issues American law enforcement agencies have spent decades trying to ignore.

Then there’s the AI itself. Everything in use at this point is still very much in the experimental stage. Auto-generated reports might turn into completely unusable evidence, thanks to the wholly expected failings of the underlying software.

These reports, though, are often used as evidence in criminal trials, and critics are concerned that relying on AI could put people at risk by depending on language models that are known to “hallucinate,” or make things up, as well as display racial bias, either blatantly or unconsciously.

That’s a huge problem. Also problematic is the expected workflow, which will basically allow cops to grade their own papers by letting the AI handle the basics before they step in and clean up anything that doesn’t agree with the narrative an officer is trying to push. This kind of follow-up won’t be optional, which also might mean some agencies will have to allow officers to review their own body cam footage — something they may have previously forbidden for exactly this reason.

On top of that, there’s the garbage-in, garbage-out problem. AI trained on narratives provided by officers may take it upon itself to “correct” narratives that seem to indicate an officer may have done something wrong. It’s also going to lend itself to biased policing by tech-washing BS stops by racist cops, portraying these as essential contributions to public safety.

Of course, plenty of officers do these sorts of things already, so there’s a possibility it won’t make anything worse. But if the process Axon is pitching makes things faster, there’s no reason to believe what’s already wrong with American policing won’t get worse in the future. And, as the tech improves (so to speak), the exacerbation of existing problems and the problems introduced by the addition of AI will steadily accelerate.

That’s not to say there’s no utility in processes that reduce the amount of time spent on paperwork. But splitting off a clerical division might be a better solution — a part of the police force that handles the paperwork and vets camera footage, staffed by people other than the officers who captured the recordings and participated in the traffic stop, investigation, or dispatch call response.

And I will say this for Axon: at least its CEO recognizes the problems this could introduce and suggests agencies limit automated report creation to things like misdemeanors, and never use it in cases where deadly force is deployed. But, like any product, it will be the end users who decide how it’s used. And so far, the expected end users are more than willing to streamline things they view as inessential, but are far less interested in curtailing abuse by those using these systems. Waiting to see how things play out just isn’t an acceptable option — not when there are actual lives and liberties on the line.

Companies: axon


Comments on “Axon Wants Its Body Cameras To Start Writing Officers’ Reports For Them”


This comment has been flagged by the community.

Koby (profile) says:

Vector For Subversion

We know that some platforms are auto-generating captions and transcripts, and some content creators speak in lingo in order to defeat the transcripts and the algorithm. I wonder if civilians could develop a form of spoken slang that might wildly mislead the report generator? Inattentive police might let wild transcription errors slip into a report unless they spend additional time making numerous corrections.

Anonymous Coward says:

Cops can then review the document to ensure accuracy.

If they’re smart at Axon, they would only offer this as a premium subscription.
The basic subscription tier would generate the same random gibberish but would attach the officer’s name, picture, and generated signature, and this would leak at the next data breach. It seems they’ve already tested the latter.

Chris Brand says:

Expected workflow problem

If this is billed as “reducing paperwork for cops” and it changes their job from “transcribing the audio” to “checking the AI-generated transcription” then surely they were already reviewing the recordings, and if that review is forbidden then the tool just wouldn’t be applicable…

This comment has been flagged by the community.

Anonymous Coward says:

This isn't going to work in the real world

I happen to have been involved with a transcription service (that uses humans). We contracted with them to deal with rather a lot of mixed audio, some of it recorded under ideal conditions (single speaker, noise-free environment) and some of it recorded under terrible conditions (multiple speakers, noisy environment).

And even these skilled, experienced people struggled with the latter, often requiring multiple passes through the audio with EQ and other filters in order to (partially) isolate speakers.

So what’s going to happen when this software is handed an audio track that has multiple cops shouting, multiple civilians shouting, sirens, vehicle sounds, background sounds, multiple languages, slang, etc.?

And when it doesn’t work, and its output is subsequently introduced in court as “evidence,” it will create a bias (conscious or unconscious) toward assuming that, because it was generated by a computer, it must be right — and witnesses who dispute it will be viewed as less than credible.

PaulT (profile) says:

Re:

I mean, the easy fix for that is to always allow access to the original recording and allow alternative transcripts as evidence.

I know why that probably won’t happen and why the cops will try to block it, but it should be a default assumption that anything generated by AI already has a bias, conscious or not. It could be a good first draft, but if you’re entering it into a court you should demand better quality (yes, I know that often won’t happen).

Anonymous Coward says:

On the one hand… one might actually find police reports without a lot of made-up bullshit for five minutes, until the cops re-write them. (But AI transcription… just lol.)

On the other hand, as bad as most cops already are, there’s probably some decent exercise in filing reports that cops need.

On the third hand, most cops outside densely populated areas aren’t doing squat 95% of the time anyway, why take away another thing that justifies their job?
