TikTok Sued Again By Parents Whose Children Killed Themselves Participating In A ‘Blackout Challenge’
from the about-as-useful-as-thoughts-and-prayers dept
A couple of months ago, the parents of a 10-year-old who died of asphyxiation while allegedly “participating” in a “blackout challenge” sued TikTok, alleging their child’s death was directly related to the social media platform’s moderation efforts (or lack thereof) and content recommendation algorithms. The suit, filed in a Pennsylvania federal court, claimed the death had everything to do with TikTok’s decision to value profits over user safety. And it attempted to dodge the inevitable Section 230 question by alleging this had nothing to do with the third party content the child had viewed and everything to do with TikTok’s handling of, well, third party content.
A similar lawsuit has just been filed by the families of two children who died under similar circumstances.
Eight-year-old Lalani Erika Walton wanted to become “TikTok famous.” Instead, she wound up dead.
Hers is one of two such tragedies that prompted a linked pair of wrongful death lawsuits filed Friday in Los Angeles County Superior Court against the social media giant. The company’s app fed both Lalani and Arriani Jaileen Arroyo, 9, videos associated with a viral trend called the blackout challenge in which participants attempt to choke themselves into unconsciousness, the cases allege; both of the young girls died after trying to join in.
Unlike the May lawsuit, this one [PDF] has been filed in a California county court. But its allegations are pretty much the same as those being made in a federal court on the other side of the nation. The causes of action are defective design, negligence, failure to warn, and, specific to this case, violations of California consumer protection laws.
What’s not discussed at all is Section 230 of the CDA, something that might be a bit easier to avoid if the plaintiffs can keep the lawsuit in the county court and the judge focused on alleged consumer law violations. But it’s a discussion that’s all but inevitable.
While the plaintiffs in both cases focus on defective design, negligence, and other things allegedly traceable to TikTok’s moderation efforts and content recommendation engine, the unavoidable fact is that the acts instigating the lawsuits were compelled by content posted by other TikTok users. That is a third party content problem. TikTok’s algorithms may have played a part in surfacing harmful content, but its algorithms are nothing without a steady stream of user-generated content and the inputs of TikTok users consuming the generated content.
Suing dozens of TikTok users for posting harmful content is impractical. It’s also a losing strategy: the tragedies forming the basis for these lawsuits were the actions of individual users, as difficult as that is to accept. Suing TikTok makes only slightly more sense than attempting to hold the TikTok users who created the harmful content responsible for the self-harm it provoked. But making slightly more sense doesn’t put the plaintiffs on the path to courtroom victory.
TikTok may have a wealth of content moderation problems. It may be cutting corners in moderation to ensure maximum profitability. It may have discovered — like so many other platforms — that exponential growth creates content moderation problems that are impossible to solve. And it may very well have promoted harmful content to certain users — not in hopes that they’d harm themselves, but in an effort to extend engagement and retain users. But all of this together does not add up to legal culpability.
Again, what I’ve said above is not an attempt to blame the victims or their survivors for these tragic deaths. It’s very easy to say that parents should have been more involved, especially given the ages of the victims here. But children can often be inscrutable black boxes. Sometimes the only way to discover what should have been done is to examine the evidence after the tragedy has already occurred. And while that may provide some guidance going forward, it does not turn the clock back on the tragedy or make the future easier for families who’ve lost young children.
Unfortunately, neither will these lawsuits. And it seems unseemly, at best, for law firms to give grieving parents false hope that the court system can provide some kind of payout, much less closure, by suing social media platforms over the actions of their users.
Filed Under: children, moral panic, section 230, tiktok challenges
Companies: tiktok
Comments on “TikTok Sued Again By Parents Whose Children Killed Themselves Participating In A ‘Blackout Challenge’”
Given that TikTok was the subject of a settlement having to do with obtaining parental consent, I’m surprised that wasn’t raised here. Is there another COPPA aspect to this case?
Re:
You are right bro.
Re: Parental Consent issue
The lawsuit actually states that none of the parents consented to their children’s use. An example paragraph.
That seems to mean that there’s some liability for TikTok there?
Re: Re:
If that is the case, then the child was using TikTok in violation of the Terms of Service, and shouldn’t have been there watching stupid challenge videos in the first place. Not TikTok’s fault.
Re: Re:
The problem is that either the parents failed to utilize parental controls on the kid’s internet access, or they let the kid use the internet unsupervised, or they didn’t do a good job implementing controls, leaving holes.
The kid’s access to TikTok was unauthorized by TikTok, under their terms of service – which is a CFAA violation, as some courts interpret the law. If it’s a felony for the kid to be on TikTok at all, TikTok is probably not at fault.
Parents fail to keep tabs on what their children are doing, and accuse Tik-Tok of doing the same, while ignoring that Tik-Tok has far more kids per adult to monitor.
Re:
TikTok has the benefit of an engineering and moderation team in the thousands, as well as state-of-the-art machine learning and moderation technology.
Re: Re:
Which has limited value in figuring out the new ways kids come up with to injure themselves. Moderation is largely reactive, called into use after the first injuries or deaths, while protecting kids needs a more proactive approach, like explaining to them that what you just saw them do is dangerous.
Re: Re:
TikTok also lacks a duty of care, and at the same time is immune to lawsuits of this type.
Another day, another scammer looking to illegally obtain a payday.
Re:
None of this is illegal.
Re: Re:
In the sense that the scammers will face jail time for their conduct: no.
In the sense that the federal law they’re violating should shut this fraud down and leave them out their time and money: yes.
It’s the TikTok “cut your own head off with a chainsaw” challenge.
Here’s the thing: if TikTok just spent more time and money in moderating, it wouldn’t matter that parents use the Internet as their babysitter. It’s not like a website has no way of telling that a kid put in a date of birth several years before the actual event, and they are certainly aware when that same kid steps away from the computer to choke themselves out for an inane challenge and winds up dead. So this is entirely TikTok’s fault for not having essential knowledge, and absolutely nothing to do with the parents not checking in on little Tiffany every few minutes. /s
Re:
You’re new to Techdirt, aren’t you?
Go back into the site archives and look up the dozens of articles about how automated content moderation is the only way to do it at scale, and yet automating the job doesn’t work.
The biggest and best funded companies in the world routinely fail spectacularly at automated content moderation, and they can afford the absolute best tools for it. It doesn’t help them surpass the level of a dumpster fire in their content moderation efforts.
TikTok is not the biggest and best funded company in the world by a long shot. It’s really easy for you to SAY they should be able to do this easily, but the reality is that if the first-rate companies can’t, there’s no way a third-rate company like TikTok has any chance at all.
Re: Re:
I see that a great number of people missed the sarc mark that was put at the end of that nevertheless obviously sarcastic comment, but you’re the only one stupid enough to double down on your ignorance.
Re: Re: Re:
Agreed. If everyone who flagged that post had bothered to read it to the end, they’d have seen where the other AC highlighted the importance of parental supervision.
Re: Re: Re:
This just turns Poe’s Law on its head, showing that even with a clear indicator of the author’s intent, there are some people who will be taken in by a satirical post.
“Sometimes the only way to discover what should have been done is to examine the evidence after the tragedy has already occurred.”
Sometimes the only way is to actually pay attention to the world.
Child dies in backseat in summer.
Pretty sure it shouldn’t have taken more than one to make the point, but now we’re mandating systems to alert parents their kids are in the backseat.
Child dies after seeing “fill in the blank” online.
Yet parents are still relying on the digital babysitter, & are upset the corporations didn’t care about their kid.
Child gunned down in school.
…
Perhaps humans are just that stupid.