Yet Another Lawsuit Hopes A Court Will Hold Twitter Responsible For Terrorists' Actions

from the law-firms-basically-setting-up-franchises dept

So, this is how we’re handling the War on Terror here on the homefront: lawsuit after lawsuit after lawsuit against social media platforms because terrorists also like to tweet and post stuff on Facebook.

The same law firm (New York’s Berkman Law Office) that brought us last July’s lawsuit against Facebook (because terrorist organization Hamas also uses Facebook) is now bringing one against Twitter because ISIS uses Twitter. (h/t Lawfare’s Ben Wittes)

Behind the law firm are more families of victims of terrorist attacks — this time those in Brussels and Paris. Once again, any criticism of this lawsuit (and others of its type) is not an attack on those who have lost loved ones to horrific acts of violence perpetrated by terrorist organizations.

The criticisms here are the same as they have been in any previous case: the lawsuits are useless and potentially dangerous. They attempt to hold social media platforms accountable for the actions of terrorists. At the heart of every sued company’s defense is Section 230 of the CDA, which immunizes them against civil lawsuits predicated on the actions and words of the platform’s users.

The lawsuits should be doomed to fail, but there’s always a chance a judge will construe the plaintiffs’ arguments in a way that circumvents this built-in protection or, worse, issue a precedential ruling carving a hole in these protections.

The arguments here are identical to the other lawsuits: Twitter allegedly hasn’t done enough to prevent terrorists from using its platform. Therefore, Twitter (somehow) provides material support to terrorists by not shutting down (one of) their means of communication (fast enough).

The filing [PDF] is long, containing a rather detailed history of the rise of the Islamic State, a full rundown of the attacks in Brussels and Paris, and numerous examples of social media posts by terrorists. It’s rather light on legal arguments, but then it has to be, because the lawsuit works better when it tugs at the heartstrings, rather than addressing the legal issues head on.

The lawsuit even takes time to portray Twitter’s shutdown of Dataminr’s feed to US government surveillance agencies — as well as its policy of notifying users of government/law enforcement demands for personal information — as evidence of its negligence toward, if not outright support of, terrorist groups.

The problem with these lawsuits — even without the Section 230 hurdle — is that the only way for Twitter, Facebook, etc. to avoid being accused of “material support” for terrorism is to somehow predetermine what is or isn’t terrorist-related before it’s posted… or even before accounts are created. To do otherwise is to fail. Any content posted can immediately be reposted by supporters and detractors alike.

And that’s another issue that isn’t easily sorted out by platforms with hundreds of millions of users. Posts and tweets are just as often passed on by people who don’t agree with the content, but the arguments made in these lawsuits expect social media platforms to determine intent… and take action almost immediately. Any post or account that stays “live” for too long becomes a liability, should courts find in favor of these plaintiffs. It’s an impossible standard to meet.

These lawsuits ask courts to shoot the medium, rather than the messenger. They make about as much sense as suing cell phone manufacturers because they’re not doing enough to prevent terrorists from buying their phones and using them to communicate.

Companies: twitter


Comments on “Yet Another Lawsuit Hopes A Court Will Hold Twitter Responsible For Terrorists' Actions”

orbitalinsertion (profile) says:

Re: Re: Re:

Or the atmosphere, just for carrying sound waves. And the countries the terrorists are from, or live in, or acted in (oops). And any phone company, anyone with a public WAP that may have been used, ISPs, and the terrorists’ parents. And anyone who ever may have walked past one without stopping them ahead of time.

Seriously, the stupidest thing about this is that they want to go after public spaces on the net where everyone else can see what they are doing, even if they (or the platform) can’t tell it is part of an evil plot up front. And they don’t seem to want to litigate against any medium where they can’t see the presence of the criminals who did the attack (or whatever). Not that they should, but there is something deeply infantile about the unrecognized "distinction" these people are drawing here*.

*Some are. I am sure plenty get swept into it via rhetoric and emotional manipulation as well. Still, I don’t know how anyone can see it as a sensible idea.

Anonymous Coward says:

Let's Compare

Modern terrorism:
Thousands dead in scattered activity.

20th Century governments supported in part by censorship, propaganda, and stifling of dissent: 100 million+ (actually, we’ll never really know) dead in organized murder with no recourse.

Inexplicably, I find myself far more concerned about the long-term damage of censorious case law and legislation, coupled with ‘for the children’ law enforcement zeal, than about whether or not Johnny Blow-up can tweet.

AEIO_ (profile) says:

Re: And there's a much more legitimate argument against the gun manufacturers.

Gun control means hitting your target.

Lawsuits against gun manufacturers should come when a gun malfunctions, injuring or killing the operator. Other than that: NO.

“there’s a much more legitimate argument”? You mean
* legally (an active lawsuit somewhere),
* morally (Don’t be naughty! — wait, that’s Google), or
* you just don’t like the current situation?

I’ve got a gun (oh the HORROR!) by my nightstand. It’s not loaded but I treat it as such, and in 2 seconds it COULD be. (No kids visit.) I take it outside when going back into the woods, and when weird things are happening within 1/4 of a mile of my house. My neighbors? The nearest is 1/4 of a mile away, the next few are 1/2 of a mile. If anything happens I’m on my own unless I can use the phone for help (and they’re there INSTANTLY! For 15 minutes of instant if I’m lucky) or someone happens to drive along the road and notices odd things / bothers to call.

That One Guy (profile) says:

Re: Re: And there's a much more legitimate argument against the gun manufacturers.

I think you might be taking their comment the wrong way. As I read it they’re basically saying ‘If Twitter can be held personally responsible for those that use their site, then by that logic gun manufacturers/sellers should be liable for those that use their products’ without necessarily agreeing with that logic themselves.

If, when someone uses a gun to commit a crime, the shooter rather than the one who made/sold the gun is held responsible, then likewise blame the murderous losers for their actions, don’t blame the site they use.

Wendy Cockcroft (user link) says:

Re: Re: And there's a much more legitimate argument against the gun manufacturers.

@ AEIO_ I’m not a fan of guns but you seem like the kind of person I’d have no problem with where they are concerned.

Gun control advocates aren’t interested in sensible folks like you; we just don’t want criminals and loons getting hold of them too easily.

Anonymous Coward says:

Re: Go for it

Twitter is free to remove anyone from the service who breaks the site’s terms and conditions. The admins cannot pre-emptively know whether a user is going to break those terms. We do not exist in the world of “Minority Report”, no matter how much you think tech companies can pull off that kind of miracle.

Richard (profile) says:

Re: Re: Go for it

Twitter is free to remove anyone from the service who breaks the site’s terms and conditions.

Fact is that both FB and Twitter seem to be rather better at removing innocuous things (like pictures of statues and girls eating ice cream), or even people arguing against Islamism, than they are at removing the propaganda of actual terrorists.

It would be better for them if they removed rather less, thus avoiding giving the impression that they are better able to remove stuff than they actually are. I’m sure that this impression is one of the things that motivates these lawsuits.

Richard (profile) says:

Re: Re: death to white supremacists

and just how many of these terrorist groups have been spawned by the US and its allies?

Exactly none of them.

What the US did do was to finance them (mostly by buying oil from Saudi Arabia, etc.) and remove the secular dictators that stood in their way.

But spawning them? No. That goes back to well before the US was even thought of, nearly 1400 years to be more precise.

Wendy Cockcroft (user link) says:

Re: Re: Re:2 death to white supremacists

That is where the modern terrorist movements spring from. ISIS is the product of neocon policies in Iraq. It was born in one of the major prisons because the terrorists were permitted to associate while exercising. So they associated. And when they were released, they started up again; same dance, different tune.

ISIS is a different animal from the others, though. It’s about gaining and holding territory, then starting WW3. For this reason it’s impossible to negotiate with them. That said, a lot of members are becoming disillusioned with them as life in the “Caliphate” is not the paradise they expected it to be. Imagine living in a concentration camp run by religious extremists who kill and maim people who don’t toe their line – that’s what it’s like there.

They recruit by offering misfits a chance to take part in a glorious revolution and filling their heads with romantic notions of paradise on earth. It’s heady stuff for people who feel like their lives are going nowhere.

They use social media, to be sure, but that’s just a communication tool. It’s the grooming that’s the problem. You won’t get rid of or even reduce the effectiveness of ISIS by driving them off social media. And their use of social media makes them as visible to their enemies as to potential recruits.

“My $loved_one died, give me $$$$$!” is what is going on with these lawsuits, and while it seems mean to bash grieving families, someone needs to tell them that litigation against platforms like Twitter is ultimately worthless in terms of either stopping ISIS or making their lives better.

John85851 (profile) says:

Section 230 of the CDA

At what point are we going to hold lawyers responsible? I see two options:
1) They’re not aware of Section 230 of the CDA, which is why they think Twitter is responsible.
2) They *are* aware of Section 230 of the CDA and they’re filing the lawsuit against Twitter anyway, hoping they can settle out of court and make some money.

Both of these options should be grounds for disbarment.
