AI Lawyer Will Represent Client In Traffic Court, Threatening Nonexistent Market For Traffic Court Lawyers

from the if-it-can-make-it-there,-can-it-make-it-anywhere? dept

It’s the rise of the lawbots, something not even foretold by Futurama, which allowed a “simple hyper-chicken from a backwoods asteroid” to perform much of the series’ criminal justice work.

AI-in-everything is on the rise. And that includes lowball court cases, as Lauren Leffer reports for Gizmodo.

An AI-based legal advisor is set to play the role of a lawyer in an actual court case for the first time. Via an earpiece, the artificial intelligence will coach a courtroom defendant on what to say to get out of the associated fines and consequences of a speeding charge, AI-company DoNotPay has claimed in a report initially from New Scientist and confirmed by Gizmodo.

The in-person speeding ticket hearing is scheduled to take place in a U.S. courtroom (specifically, not California) sometime in February, DoNotPay’s founder and CEO Joshua Browder told Gizmodo in a phone call. However, Browder and the company wouldn’t provide any further case details to protect the defendant’s privacy.

AI legal representation in an actual court case will be happening. But, as the saying goes, who knows where or when. The smart money is on February, at least according to the New Scientist report linked to by Gizmodo. Where remains a mystery, but it’s traffic court, so the location doesn’t really matter.

Browder’s DoNotPay bot has been around for a few years at this point. It was originally created to help people fight bogus parking tickets, a suitably low-stakes environment for testing the power of the legal AI. It experienced a lot of early success: a 64% success rate across 250,000 cases involving more than $4 million in fines. But that success came in a legal arena where challenges are anomalies and the stakes are low enough that cities will often dismiss tickets rather than bear the expense of addressing challenges by drivers.

The same thing applies to traffic court. The stakes are low. The odds of success are rather high, considering it can take nothing more than the ticketing officer’s inability to attend court to secure a win for the driver.

And there’s nothing wrong with providing AI assistance to people who have nothing more than a bit of money at stake. Whatever helps even the odds is a welcome addition to a process that pretty much ignores the presumption of innocence, mainly because so few people bother to take the time to show up in court to challenge tickets.

That Browder has decided his AI might be capable of securing people’s literal freedom is more concerning. In 2017, he added functionality to assist immigrants with their asylum applications. Immigration law is much more complex than traffic law, and there’s a good chance the use of Browder’s AI may have made things worse for some applicants simply because there are a lot more inscrutable variables involved.

That being said, asking an AI to defend you in traffic court is a good test environment, one likely to have little effect on life or liberty no matter the outcome. But not all vehicle infractions are low stakes. In fact, challenges to the long-used practice of “chalking” tires to determine how long a vehicle has been parked have resulted in two appellate-level decisions, with one finding the practice to be a violation of Fourth Amendment rights. So, in some cases, an issue may appear negligible while still in traffic court, but take on greater constitutional implications once freed of those confines by shrewd lawyering.

Then there are the negative side effects of being represented by an algorithm. While most traffic court dispensations rely on rote recitals by judges and ticketed drivers, a few don’t. And while most courts are willing to grant more leeway to laypeople representing themselves, it seems unlikely (human) judges will do the same when it becomes clear they’re dealing with an AI interloper that (without doing anything) insinuates it’s smarter than the average defendant, not to mention the average judge.

Browder has addressed this possibility of AI reliance being a net negative for this defendant, but he does so a bit too blithely:

The CEO said the company is also working with another U.S.-based speeding ticket defendant in a case that will go to Zoom trial. In that instance, DoNotPay is weighing the use of a teleprompter vs. a synthetic voice—the latter strategy Browder described as “highly illegal.” But he’s not too concerned about legal repercussions because “at the end of the day, it’s a traffic ticket.” Browder doesn’t expect courts to come down hard on speeding defendants over AI-coaching, and the law doesn’t have explicit provisions in it barring AI-legal assistance. Plus, “it’s an experiment and we like to take risks,” he added.

This is not to say AI has no business operating in the legal field. In traffic cases where someone’s driving privileges or freedom aren’t on the line, AI assistance may help, especially when the person it’s aiding has no legal expertise.

And lawyers may find AI useful while seeking relevant precedent or composing briefs and contracts, what with AI’s willingness to plumb the depths of legal rulings and corporate boilerplate to find solutions. But it’s unlikely (or, at least, incredibly unwise) that people facing serious legal issues, like lawsuits or criminal charges, will rely on AI to get them out of a legal jam. Good lawyers are good not just because they know the law. They also know the system and, most importantly, the people operating it. An AI can’t easily duplicate personal relationships with opposing counsel. Nor can it easily take advantage of unforced errors by legal opponents.

But in areas where lawyers are seldom retained, and users fully apprised of the limitations of the AI they’re relying on (which may find new, truly surprising ways to fail when navigating untested areas), there’s probably little harm in asking for some help when attempting to save a few bucks by challenging a bogus ticket. For everything else, actual people — as fallible as they can be — are still the best bet.

Companies: donotpay


Comments on “AI Lawyer Will Represent Client In Traffic Court, Threatening Nonexistent Market For Traffic Court Lawyers”



wjohnson343 (profile) says:

Section 230 allows online harassers to harm victims with no recourse

Nobody except pig tech companies likes Section 230. Section 230 puts the corporate profits of immoral tech companies like Tech Dirt, Google, and Facebook above all else and above the safety of victims of online harassment. This is idiotic and no civilized society should tolerate this. Fuck Section 230. Fuck Tech Dirt. Fuck supporters of Section 230.

wjohnson343 (profile) says:

Re: Re: Tech platforms enable online harassment by keeping illegal content up

and ignoring court orders or proof submitted by the victim. Tech companies don’t care and actively help harassers by making it as difficult as possible for victims to identify the perpetrators or to get personal private information removed.

Don’t even fucken pretend tech companies care about victims. They simply do not. Until SECTION 230 is repealed, tech platforms have no incentive to crack down on illegal or harmful content like doxxing or cyberstalking.

You dumb fuck.

wjohnson343 (profile) says:

Re: Re: Online speech should not include online harassment and doxxing

Online speech should not include online harassment and doxxing. TechDirt cyberlibertarian pigs apparently want “online speech” to be all inclusive, even revenge porn and other illegal shit.

Very vile and immoral group of people – only care about profits and not give a shit about victims or public safety.

bluegrassgeek (profile) says:

What law school did the AI attend?

The biggest issue with this is “Who is actually responsible if the client receives poor legal advice from this program?”

I can guarantee the tech startup won’t claim legal responsibility; they’ll blame it on the program. And the courts aren’t going to recognize the program as a person, much less a lawyer.

Which means the court is treating the defendant as representing themselves. And they’re the one on the hook if the program gives them bad advice.

Tanner Andrews (profile) says:

Re: who gets the blame

Who is actually responsible if the client receives poor legal advice from this program?

As you surmise, the defendant himself. Treat it as self-representation, truly it is that.

In years past, who was to blame when the defendant went to the law library and cracked open the books? He may have gotten good advice and opinions, or he might have gotten his case’s equivalent of Betts v. Brady, 316 U.S. 455 (1942).

More recently, who is to blame if he uses a Westlaw terminal in the law library and gets bad information? Betts is still available online for those who ask.

And if the defendant manages to ask his smart speaker for legal references, who is to blame? Surely Amazon does not think they are in the law business.


wjohnson343 (profile) says:

Time for Section 230 to be repealed

To hold tech companies accountable for facilitating abuse, and keeping users safe from online harassment, doxing, stalking, and other forms of abuse and privacy invasions.

The internet should not be a toilet, to the shock of imbeciles on Tech Dirt who believe in free speech absolutism.

Free Speech is only one right, to be balanced with other rights, like privacy.

The EU and UK understand this. Fucken USA doesn’t.

Christenson says:

People matter *a lot*...story

A decade or so ago, I got out of paying a ticket by showing up in court, and the judge basically said that if I had been represented, he would not have let that happen.

So, yeah, the judge and his triggers and pet peeves and the things he cares about matter a lot to getting a good decision.

Unfortunately, this is also now true at the US Supreme Court, not just small county courts!
