TikTok Sued Again By Parents Whose Children Killed Themselves Participating In A ‘Blackout Challenge’

from the about-as-useful-as-thoughts-and-prayers dept

A couple of months ago, the parents of a 10-year-old who died of asphyxiation while allegedly “participating” in a “blackout challenge” sued TikTok, alleging their child’s death was directly related to the social media platform’s moderation efforts (or lack thereof) and content recommendation algorithms. The suit, filed in a Pennsylvania federal court, claimed the death had everything to do with TikTok’s decision to value profits over user safety. And it attempted to dodge the inevitable Section 230 question by alleging this had nothing to do with the third party content the child had viewed and everything to do with TikTok’s handling of, well, third party content.

A similar lawsuit has just been filed by the families of two children who died under similar circumstances.

Eight-year-old Lalani Erika Walton wanted to become “TikTok famous.” Instead, she wound up dead.

Hers is one of two such tragedies that prompted a linked pair of wrongful death lawsuits filed Friday in Los Angeles County Superior Court against the social media giant. The company’s app fed both Lalani and Arriani Jaileen Arroyo, 9, videos associated with a viral trend called the blackout challenge in which participants attempt to choke themselves into unconsciousness, the cases allege; both of the young girls died after trying to join in.

Unlike the May lawsuit, this one [PDF] has been filed in a California county court. But its allegations are pretty much the same as those being made in a federal court on the other side of the nation. The causes of action are defective design, negligence, failure to warn, and, specific to this case, violations of California consumer protection laws.

What’s not discussed at all is Section 230 of the CDA, something that might be a bit easier to avoid if the plaintiffs can keep the lawsuit in the county court and the judge focused on alleged consumer law violations. But it’s a discussion that’s all but inevitable.

While the plaintiffs in both cases focus on defective design, negligence, and other things allegedly traceable to TikTok’s moderation efforts and content recommendation engine, the unavoidable fact is that the acts instigating the lawsuits were prompted by content posted by other TikTok users. That is a third party content problem. TikTok’s algorithms may have played a part in surfacing harmful content, but those algorithms are nothing without a steady stream of user-generated content and the input of the TikTok users consuming it.

Suing dozens of TikTok users for posting harmful content is all but impossible. It’s also a losing strategy: the tragedies forming the basis for these lawsuits were the result of the victims’ own actions, as difficult as that is to accept. Suing TikTok makes only slightly more sense than attempting to hold the TikTok users who created the harmful content responsible for the self-harm their content provoked. But making slightly more sense doesn’t put the plaintiffs on the path to courtroom victory.

TikTok may have a wealth of content moderation problems. It may be cutting corners in moderation to ensure maximum profitability. It may have discovered — like so many other platforms — that exponential growth creates content moderation problems that are impossible to solve. And it may very well have promoted harmful content to certain users — not in hopes that they’d harm themselves, but in an effort to extend engagement and retain users. But all of this together does not add up to legal culpability.

Again, what I’ve said above is not an attempt to blame the victims or their survivors for these tragic deaths. It’s very easy to say that parents should have been more involved, especially given the ages of the victims here. But children can often be inscrutable black boxes. Sometimes the only way to discover what should have been done is to examine the evidence after the tragedy has already occurred. And while that may provide some guidance going forward, it does not turn the clock back on the tragedy or make the future easier for families who’ve lost young children.

Unfortunately, neither will these lawsuits. And it seems unseemly, at best, for law firms to give grieving parents the false hope the court system can provide some kind of payout, much less closure, by suing social media platforms over the actions of their users.

Companies: tiktok


Comments on “TikTok Sued Again By Parents Whose Children Killed Themselves Participating In A ‘Blackout Challenge’”

Jack the Nonabrasive (profile) says:

Re: Parental Consent issue

The lawsuit actually states that none of the parents consented to their children’s use. An example paragraph:

Christina Arlington has not entered into a User Agreement or other contractual relationship with TikTok herein in connection with Lalani Walton’s use of Defendants’ social media product.

That seems to mean that there’s some liability for TikTok there?

Bergman (profile) says:

Re: Re:

The problem is that either the parents failed to utilize parental controls on the kid’s internet access, or they let the kid use the internet unsupervised, or they didn’t do a good job implementing controls, leaving holes.

The kid’s access to TikTok was unauthorized by TikTok, under their terms of service – which is a CFAA violation, as some courts interpret the law. If it’s a felony for the kid to be on TikTok at all, TikTok is probably not at fault.

Anonymous Coward says:

Re: Re:

Which has limited value in figuring out the new ways kids come up with to injure themselves. Moderation is largely reactive, called into use after the first injuries or deaths, while protecting kids needs a more proactive approach, like explaining to them that what you just saw them do is dangerous.


Anonymous Coward says:

Here’s the thing: if TikTok just spent more time and money in moderating, it wouldn’t matter that parents use the Internet as their babysitter. It’s not like a website has no way of telling that a kid put in a date of birth several years before the actual event, and they are certainly aware when that same kid steps away from the computer to choke themselves out for an inane challenge and winds up dead. So this is entirely TikTok’s fault for not having essential knowledge, and absolutely nothing to do with the parents not checking in on little Tiffany every few minutes. /s

Bergman (profile) says:

Re:

You’re new to Techdirt, aren’t you?

Go back into the site archives and look up the dozens of articles about how automated content moderation is the only way to do it, but at the same time automating the job doesn’t work.

The biggest and best funded companies in the world routinely fail spectacularly at automated content moderation, and they can afford the absolute best tools for it. It doesn’t help them surpass the level of a dumpster fire in their content moderation efforts.

TikTok is not the biggest and best funded company in the world by a long shot. It’s really easy for you to SAY they should be able to do this easily, but the reality is that if the first-rate companies can’t, there’s no way a third-rate company like TikTok has any chance at all.

That Anonymous Coward (profile) says:

“Sometimes the only way to discover what should have been done is to examine the evidence after the tragedy has already occurred.”

Sometimes the only way is to actually pay attention to the world.

Child dies in backseat in summer.
Pretty sure it shouldn’t have taken more than one to make the point, but now we’re mandating systems to alert parents that their kids are in the backseat.

Child dies after seeing “fill in the blank” online.
Yet parents are still relying on the digital babysitter, & are upset the corporations didn’t care about their kid.

Child gunned down in school.

Perhaps humans are just that stupid.
