Judge Doesn't Find Much To Like In 'Material Support For Terrorism' Lawsuit Against Twitter

from the doubling-down-on-wrong-does-no-one-any-favors dept

The lawsuit against Twitter for “providing material support” to ISIS (predicated on the fact that ISIS members use Twitter to communicate) — filed in January by the widow of a man killed in an ISIS raid — is in trouble.

Twitter filed its motion to dismiss in March, arguing logically enough that the plaintiff had offered nothing more than conclusory claims about its “support” of terrorism, and that there was no link between Twitter and the terrorist act that killed the plaintiff’s husband. On top of that, it pointed out the obvious: Section 230 does not allow service providers to be held responsible for the actions of their users.

As reported by Nicholas Iovino of Courthouse News Service, the presiding judge doesn’t seem too impressed by what he’s seen so far from the plaintiff.

U.S. District Judge William Orrick said the complaint fails to show a link between the social media network’s actions and the attack that took five lives in Jordan.

“I just don’t see causation under the Antiterrorism Act,” Orrick said. “There’s no allegation that ISIS used Twitter to recruit Zaid.”

That deals a blow to one of the lawsuit’s allegations. Orrick also didn’t buy the plaintiff’s claim that Twitter direct messages are somehow different from regular tweets when it comes to Section 230 protections.

Orrick was not persuaded that companies like Twitter could be sued for messages sent by users.

“Just because it’s private messaging doesn’t put this beyond the Communications Decency Act’s reach,” Orrick said.

This was in response to the plaintiff’s lawyer’s assertion that because direct messages are not accessible by the public, Twitter couldn’t avail itself of Section 230 protections as a “publisher.” Twitter’s lawyer countered by pointing out email providers are still considered “publishers” and they can’t be held responsible for users’ communications, even though those messages are never made public.

It only took about 40 minutes for Judge Orrick to reach a decision, albeit one that doesn’t shut down this ridiculous lawsuit completely. The lawsuit has been dismissed, but without prejudice and with an invitation for the plaintiff to file an amended complaint.

Given the hurdles the plaintiff needs to clear (some logical, some statutory) to hold Twitter responsible for the actions of terrorists halfway around the world, it’s unlikely that an amended complaint will fix this seriously misguided lawsuit. The only people truly responsible for the plaintiff’s husband’s death are those who took his life. While it’s an understandable emotional response to want someone to pay for the murder of a loved one, sometimes there’s no way to receive that sort of closure.

Twitter isn’t a closed platform developed solely for terrorists’ communications. It’s available to anyone with an email address… even terrorists. Twitter is routinely criticized for its handling of illicit material and abusive behavior, but the fact remains: these unpleasant communications are created by users, not by Twitter. Any attempt to connect the dots between a terrorist attack and terrorist chatter is tenuous, and any attempt to hold platforms responsible for the actions of their users carries with it the potential to make the internet worse for millions of law-abiding users.



Comments on “Judge Doesn't Find Much To Like In 'Material Support For Terrorism' Lawsuit Against Twitter”

Annonimus says:

Bullshit

“any attempt to hold platforms responsible for the actions of their users carries with it the potential to make the internet worse for millions of law-abiding users”

No. Just no. That statement isn’t correct. Most attempts to hold platforms responsible for the actions of their users will carry the potential to make the internet worse for millions of law-abiding users. This is because most attempts to hold platforms responsible for the actions of their users don’t actually care whether the platforms are responsible for the actions of their users. They just want methods of controlling ordinary people’s discussions imposed on internet platforms.

It’s one thing if a terrorist uses a private (or not so private) method of communication on a platform to plan and/or execute their attacks. It’s another thing when the platform itself is one of the driving mechanisms for the radicalization of a person into a terrorist.

In this case, Twitter’s private messaging was used by terrorists for some of their communications, but Twitter did not at any point do anything to encourage the radicalization of any of those people into terrorists, and therefore shouldn’t be held responsible for their actions.

Anonymous Coward says:

Re: Bullshit

While I don’t believe Twitter is ” ‘providing material support’ to ISIS”, as that would have to be a conscious act on their part, what exactly ARE they doing to prevent their platform from being used as an ISIS recruitment tool or a covert means for ISIS communication?

Twitter has no problem banning a gay conservative, Milo Yiannopoulos, for his speech on Twitter, repeatedly. Twitter openly censors speech that disagrees with its policies and has that right as the site owner. But how much effort does it put into banning speech from ISIS and other Islam affiliated groups that violates Twitter policy?

Failure to police itself with respect to ISIS and related groups, especially when it has shown a willingness to repeatedly police certain individuals and other groups, opens Twitter up to the perception that it gives a “wink and a nod” approval of these activities, despite its own policies. Twitter should start moderating all accounts equally or stop moderating entirely. Moderating based on some internal bias will open it up to charges like this.

Wendy Cockcroft (user link) says:

Re: Re: Bullshit

How exactly could ANY platform “prevent” ISIS from using it? Okay, I’ll bite.

They’d have to hire staff to comb through every message that tripped the keyword wire, i.e. usage of words like “Bomb” or “martyr.” Now imagine how hard that would be to implement. I mean, I bought a jacket recently that cost a bomb, and I’m a martyr to my leg cramps. Okay, so everyday speech would have to be examined to see whether it was terroristic or not. Assume a freakin’ huge PILE of tweets to sift through; what other filters would you use? Arabic-sounding names? What would you do to stem the torrent of tweets to make it easier to search them? Pause each tweet or message until a moderator has checked to make sure you’re not a terrorist?

This is vastly impractical; I don’t believe you’ve thought your question through, AC.
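To make the point concrete, here’s a toy sketch of the sort of naive keyword filter described above (purely hypothetical, not Twitter’s actual moderation system), showing how it flags perfectly innocent tweets:

```python
# Toy sketch of a naive keyword filter -- hypothetical, not Twitter's
# actual moderation pipeline. It flags any tweet containing a
# watchlist word, which is exactly why everyday speech gets caught.

WATCHLIST = {"bomb", "martyr"}

def flags_tweet(text):
    """Return True if any watchlist word appears in the tweet."""
    words = {word.strip(".,!?'\"").lower() for word in text.split()}
    return not WATCHLIST.isdisjoint(words)

tweets = [
    "I bought a jacket recently that cost a bomb.",
    "I'm a martyr to my leg cramps.",
    "Meet me at the cafe at noon.",
]

for tweet in tweets:
    print(flags_tweet(tweet), tweet)

# Prints True for both innocent tweets (false positives) and False for
# the third. Every hit still needs a human to decide whether it's a
# real threat -- that's the staffing problem.
```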

Anonymous Coward says:

We don't jail car manufacturers...

Just imagine if we jailed some Ford, GMC, etc. employees whenever a car they made was involved in a crime. Where would liability end? Should we also jail the individual parts suppliers?

Ah hell! Why stop there? We could also go after the guys who paved the road the perps drove on to get away! /facepalm

Anonymous Coward says:

Re: We don't jail car manufacturers...

This is exactly what anti-gunners are trying to do to gun manufacturers: they are trying to find a way to sue them. If they are successful, look for the same logic to be applied to cars, alcohol, and ultimately the internet. Whatever anyone thinks of guns and gun manufacturers, it is an extremely slippery slope to open them up to lawsuits.

John85851 (profile) says:

Re: Re: We don't jail car manufacturers...

Actually, lawmakers should hold gun manufacturers to the same safety standards as carmakers.
By law, cars must have seat belts, anti-lock brakes, air bags, and other safety features.

By comparison, guns come with a palm reader so only the owner can fire them, an RFID chip so they can only be used within a certain range of a fob (again, to prevent stolen guns from being used in crimes), and a limit of 60 bullets fired per minute. Oh, wait: NONE of that is happening.

Guns are the only products on the market that are designed to kill, and yet they have seen no improvement in safety features.

John Fenderson (profile) says:

Re: Re: Re: We don't jail car manufacturers...

I agree with this, but have two observations:

1) Right now, if a gun manufacturer produces a weapon that is actually defective in some way that causes death or injury, they do have (and should have) liability for that.

2) Additional safety standards (as with cars) seem like a good idea — but it’s wrong and dangerous to try to legally hold manufacturers to standards that don’t yet actually exist. First things first.
