Lawsuit Attempts To Hold TikTok Responsible For Death Of 10-Year-Old Girl Who Engaged In A ‘Blackout Challenge’
from the challenge-that-is-neither-new-nor-limited-to-TikTok dept
When a tragedy happens, lawsuits tend to follow. This is no exception. And while it’s understandable that grieving survivors often seek justice — whether it’s closure or compensation — through the legal system, the legal system is not there to provide solace. It’s there to determine whether anyone was legally culpable for the death.
This lawsuit, brought to us by Adam Klasfeld at Law & Crime, is another misguided attempt to hold a social media service directly responsible for a person’s actions. In this case, it’s the death of a 10-year-old TikTok user who allegedly participated in a “blackout challenge” that began with her asphyxiation in her bedroom and ended five days later when she passed away at a local hospital.
But TikTok isn’t responsible for this death. And the lawsuit [PDF] has very little chance of piercing TikTok’s Section 230 immunity, even if it’s deliberately crafted to avoid discussion of third-party liability.
The lawsuit opens by making it seem as though this is a problem unique to TikTok, which would make it unquestionably directly responsible for this child’s death. (Emphasis in the original.)
The viral and deadly TikTok Blackout Challenge was thrust in front of Nylah on her TikTok For You Page (“FYP”) as a result of TikTok’s algorithm which, according to the TikTok Defendants, is “a recommendation system that delivers content to each user that is likely to be of interest to that particular user…each person’s feed is unique and tailored to that specific individual.”
The TikTok Defendants’ algorithm determined that the deadly Blackout Challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson, and she died as a result.
The TikTok Defendants’ app and algorithm are intentionally designed to maximize user engagement and dependence and powerfully encourage children to engage in a repetitive and dopamine-driven feedback loop by watching, sharing, and attempting viral challenges and other videos. TikTok is programming children for the sake of corporate profits and promoting addiction.
Social media algorithms are definitely designed to promote engagement. That’s not in dispute. But if the algorithm suggested the “blackout challenge” to Nylah Anderson, that suggestion is traceable to Anderson’s own interactions with the service, as well as to other users’ actions, which supposedly made the challenge “viral” (to use Law & Crime’s headline wording) and more likely to surface as a suggestion to TikTok users.
What this doesn’t indicate is that TikTok saw Anderson’s account and decided — without other contributing factors — to recommend she participate in a deadly “challenge.”
TikTok has long been accused of promoting “viral” challenges that can result in injuries or death. Most of that virality is due to hyperbolic, breathless coverage of something someone saw on the internet, rather than to videos actually trending on the platform. What happened here isn’t a malicious or negligent act by TikTok. If the allegations are true, the algorithm surfaced something that was trending. It did not target a 10-year-old with a deadly challenge.
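To make that point concrete, here is a deliberately simplified, hypothetical sketch in Python of how an engagement-driven feed might rank videos. This is not TikTok’s actual code; the names, weights, and signals are invented purely for illustration. It blends a global popularity score with a score based on the user’s own viewing history, and nothing in it singles out an individual user for a specific video.

    from dataclasses import dataclass

    @dataclass
    class Video:
        tags: set[str]
        views: int
        shares: int

    def trending_score(video: Video) -> float:
        # Aggregate signal: driven entirely by what other users watch and share.
        return video.views + 5 * video.shares

    def interest_score(video: Video, history_tags: set[str]) -> float:
        # Per-user signal: overlap between a video's tags and tags the user
        # has previously watched or searched for.
        return float(len(video.tags & history_tags))

    def rank_feed(videos: list[Video], history_tags: set[str]) -> list[Video]:
        # The ranking blends global popularity with the user's own past behavior.
        # Nothing here "decides" to aim a particular video at a particular person.
        return sorted(
            videos,
            key=lambda v: trending_score(v) * (1 + interest_score(v, history_tags)),
            reverse=True,
        )

    # Example: a heavily shared challenge video outranks everything else for any
    # user whose history overlaps with its tags -- no individual targeting required.
    feed = rank_feed(
        [Video({"dance"}, views=1_000, shares=10),
         Video({"challenge", "prank"}, views=50_000, shares=4_000)],
        history_tags={"challenge", "music"},
    )

In a toy model like this, the “recommendation” falls out of aggregate engagement plus the user’s own interactions, not a decision aimed at a particular child.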
Whatever surfaced in the “For You” section is shielded by Section 230 and otherwise protected by the First Amendment. The plaintiff and her legal reps want to dodge this inevitable discussion by framing this as a defective product complaint. And they disingenuously pretend this has nothing to do with other TikTok users who contribute to algorithmic promotion of trending content.
Plaintiff does not seek to hold the TikTok Defendants liable as the speaker or publisher of third-party content and instead intends to hold the TikTok Defendants responsible for their own independent conduct as the designers, programmers, manufacturers, sellers, and/or distributors of their dangerously defective social media products and for their own independent acts of negligence as further described herein. Thus, Plaintiffs claims fall outside of any potential protections afforded by Section 230(c) of the Communications Decency Act.
Sure, you can assert that in a federal complaint. But that doesn’t mean judges are obligated to pretend the allegations have no nexus with Section 230.
The details of TikTok’s recommendation system are cited as a contributor to this death and as an indicator that the app’s creators are distributing an intentionally flawed product that values profits above user safety. And while it’s certainly true that profitability and user engagement are a bigger concern for TikTok than the distribution of potentially harmful content, TikTok gives users tools to curate their feeds, as well as information about how its algorithmic recommendations are compiled. It’s not a complete black box, and that transparency indicates TikTok is at least making some effort to limit exposure to harmful content.
The lawsuit is crafted to avoid Section 230 discussions, but there’s simply no way to avoid them when you’re dealing with algorithmic recommendations that rely heavily on third-party content and users’ own interactions with the service. Beyond that, there’s the First Amendment, which allows social media platforms to decide what content they promote and what content they choose to ban or hide.
The long list of challenges that have supposedly gone “viral” on TikTok included in the lawsuit is meaningless. Even this challenge (whose virality is, at best, disputed) is not unique to TikTok. It dates back to at least 2008. It apparently resurfaced in 2014, when it was again granted news coverage and resulted in doctors being asked to explain why choking yourself might be dangerous. Late last year, it reappeared like a cicada/horror movie villain to deliver more media consternation and more obvious statements that any challenge involving asphyxiation is harmful.
This is not to blame the victim for her death. This is only to indicate this is neither a new problem, nor one that is directly traceable to TikTok. Stupid shit is cyclical. The new constant is social media services with millions or billions of users that can cause stupidity to spread further and faster.
What this isn’t is an actionable legal case. TikTok possibly could have done more to suppress “blackout challenge” content. What it didn’t do is present its service as a harmless diversion. Moderation at scale is impossible. And no one truly expects social media services to value individual users over aggregate user counts. Whatever increases the latter is bound to get more attention from those designing engagement algorithms. But none of this adds up to product liability. And pretending it does, just to avoid certain dismissal under Section 230, is a waste of a plaintiff’s time and energy. To be clear, I’m not upset this parent decided to sue. I’m upset her lawyer wasn’t honest enough to let her know this effort would likely result in nothing but more heartache.
Filed Under: blackout challenge, blame, challenges, moral panic, section 230, viral videos
Companies: tiktok