Court (Again) Tosses Lawsuit Seeking To Hold Twitter Accountable For ISIS Terrorism

from the that's-not-how-causation-works,-never-mind-Section-230... dept

At the beginning of this year, Tamara Fields -- whose husband was killed by ISIS terrorists -- sued Twitter for "providing material support" to the terrorist group. The actions underlying Fields' lawsuit were undeniably horrific and tragic, but by no means provided any sort of legal basis for holding Twitter responsible for actions or speech undertaken by users of its service.

The lawsuit was dismissed in August, with the court pointing to Twitter's Section 230 immunity and the lawsuit's general lack of argumentative coherence. Perhaps recognizing that Section 230 (and common sense) would prevent Twitter from being held responsible for ISIS's terrorist activities, Fields chose to approach the lawsuit from some novel angles. At some points, Twitter "provided material support" by allowing ISIS members to obtain accounts. At other points, it was Twitter's inability to stop the spread of ISIS propaganda that was the issue.

The court invited Fields to file an amended complaint, hoping to obtain a coherent argument it could address with equal clarity. It didn't get it. The amended complaint may be a bit more structured, but the court has again dismissed the lawsuit [PDF] on Section 230 grounds while also addressing the deficiencies of other arguments raised by Fields. (h/t Eric Goldman)

Fields tries to drill down on the "provision of accounts" theory: that the ability of ISIS terrorists to obtain accounts somehow amounts to "material support" -- or, in any case, should strip Twitter of its Section 230 immunity. The court says this argument makes no sense and, in fact, invites the court to engage in restriction of First Amendment-protected activity.

Plaintiffs’ provision of accounts theory is slightly different, in that it is based on Twitter’s decisions about whether particular third parties may have Twitter accounts, as opposed to what particular third-party content may be posted. Plaintiffs urge that Twitter’s decision to provide ISIS with Twitter accounts is not barred by section 230(c)(1) because a “content-neutral decision about whether to provide someone with a tool is not publishing activity.”

The court disagrees. There's no way Twitter can act in a "content-neutral" manner and still deny accounts to ISIS members, as the plaintiff believes it should. The only way to discover whether an account holder might be a terrorist or terrorist sympathizer is by examining the content they post.

Although plaintiffs assert that the decision to provide an account to or withhold an account from ISIS is “content-neutral,” they offer no explanation for why this is so and I do not see how this is the case. A policy that selectively prohibits ISIS members from opening accounts would necessarily be content based as Twitter could not possibly identify ISIS members without analyzing some speech, idea or content expressed by the would-be account holder: i.e. “I am associated with ISIS.” The decision to furnish accounts would be content-neutral if Twitter made no attempt to distinguish between users based on content – for example if they prohibited everyone from obtaining an account, or they prohibited every fifth person from obtaining an account. But plaintiffs do not assert that Twitter should shut down its entire site or impose an arbitrary, content-neutral policy. Instead, they ask Twitter to specifically prohibit ISIS members and affiliates from acquiring accounts – a policy that necessarily targets the content, ideas, and affiliations of particular account holders. There is nothing content-neutral about such a policy.

The plaintiff, despite amending her complaint, still takes a have-her-cake-and-eat-it-too approach when trying to twist ISIS terrorism into a Twitter-enabled activity. The court notes that the new complaint tries to frame Twitter's provision of accounts to terrorists as the linchpin of her case, but still spends far more time complaining about Twitter's alleged moderation failures.

As discussed above, the decision to furnish an account, or prohibit a particular user from obtaining an account, is itself publishing activity. Further, while plaintiffs urge me to focus exclusively on those five short paragraphs, I cannot ignore that the majority of the SAC still focuses on ISIS’s objectionable use of Twitter and Twitter’s failure to prevent ISIS from using the site, not its failure to prevent ISIS from obtaining accounts. For example, plaintiffs spend almost nine pages, more than half of the complaint, explaining that “Twitter Knew That ISIS Was Using Its Social Network But Did Nothing”; “ISIS Used Twitter to Recruit New Members”; “ISIS Used Twitter to Fundraise”; and “ISIS Used Twitter To Spread Propaganda.” These sections are riddled with detailed descriptions of ISIS-related messages, images, and videos disseminated through Twitter and the harms allegedly caused by the dissemination of that content.

[...]

It is no surprise that plaintiffs have struggled to excise their content-based allegations; their claims are inherently tied up with ISIS’s objectionable use of Twitter, not its mere acquisition of accounts. Though plaintiffs allege that Twitter should not have provided accounts to ISIS, the unspoken end to that allegation is the rationale behind it: namely, that Twitter should not have provided accounts to ISIS because ISIS would and has used those accounts to post objectionable content.

Because Fields cannot raise her one (possibly) Section 230-dodging argument (provision of accounts) without relying heavily on arguments that squarely invoke Twitter's immunity, the lawsuit is doomed to fail no matter how many times the complaint is rewritten or how many levels up it's appealed.

In short, the theory of liability alleged in the [complaint] is not that Twitter provides material support to ISIS by providing it with Twitter accounts, but that Twitter does so by allowing ISIS to use Twitter “to send its propaganda and messaging out to the world and to draw in people vulnerable to radicalization.” SAC ¶ 41. Plaintiffs do not dispute that this theory seeks to treat Twitter as a publisher and is barred by section 230(c)(1).

Furthermore, there is nothing at all connecting Twitter to the murders committed by terrorists.

Even under plaintiffs’ proposed “substantial factor” test, see Oppo. at 11, the allegations in the SAC do not support a plausible inference of proximate causation between Twitter’s provision of accounts to ISIS and the deaths of Fields and Creach. Plaintiffs allege no connection between the shooter, Abu Zaid, and Twitter. There are no facts indicating that Abu Zaid’s attack was in any way impacted, helped by, or the result of ISIS’s presence on the social network. Instead they insist they have adequately pleaded proximate causation because they have alleged “(1) that Twitter provided fungible material support to ISIS, and (2) that ISIS was responsible for the attack in which Lloyd Fields, Jr. and James Damon Creach were killed.” Id. at 13. Under such an expansive proximate cause theory, any plaintiff could hold Twitter liable for any ISIS-related injury without alleging any connection between a particular terrorist act and Twitter’s provision of accounts. And, since plaintiffs allege that Twitter has already provided ISIS with material support, Twitter’s liability would theoretically persist indefinitely and attach to any and all future ISIS attacks. Such a standard cannot be and is not the law.

No doubt this decision will be appealed, but Fields is unlikely to find a court willing to cede as much ground on Section 230 as she would like, even given the series of bad Section 230-related decisions that have recently plagued the California court system.

Filed Under: isis, material support, material support for terrorism, section 230, tamara fields, terrorism
Companies: twitter


Reader Comments

The First Word

    Anonymous Coward, 23 Nov 2016 @ 9:43am

    A policy that selectively prohibits ISIS members from opening accounts would necessarily be content based as Twitter could not possibly identify ISIS members without analyzing some speech, idea or content expressed by the would-be account holder: i.e. “I am associated with ISIS.”

    On the other hand, almost every website DOES have a policy prohibiting young children from opening accounts, due to COPPA. It kind of seems strange to have a policy that children cannot open accounts but terrorists can.

