Lawsuit Attempts To Hold TikTok Responsible For Death Of 10-Year-Old Girl Who Engaged In A ‘Blackout Challenge’

from the challenge-that-is-neither-new-nor-limited-to-TikTok dept

When a tragedy happens, lawsuits tend to follow. This is no exception. And while it’s understandable that grieving survivors often seek justice — whether it’s closure or compensation — through the legal system, the legal system is not there to provide solace. It’s there to determine whether anyone was legally culpable for the death.

This lawsuit, brought to us by Adam Klasfeld at Law & Crime, is another misguided attempt to hold a social media service directly responsible for a person’s actions. In this case, it’s the death of a 10-year-old TikTok user who allegedly participated in a “blackout challenge” that began with her asphyxiation in her bedroom and ended five days later when she passed away at a local hospital.

But TikTok isn’t responsible for this death. And the lawsuit [PDF] has very little chance of piercing TikTok’s Section 230 immunity, even if it’s deliberately crafted to avoid discussion of third party liability.

The lawsuit opens by making it seem as though this is a problem unique to TikTok, which would make it unquestionably directly responsible for this child’s death. (Emphasis in the original.)

The viral and deadly TikTok Blackout Challenge was thrust in front of Nylah on her TikTok For You Page (“FYP”) as a result of TikTok’s algorithm which, according to the TikTok Defendants, is “a recommendation system that delivers content to each user that is likely to be of interest to that particular user…each person’s feed is unique and tailored to that specific individual.”

The TikTok Defendants’ algorithm determined that the deadly Blackout Challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson, and she died as a result.

The TikTok Defendants’ app and algorithm are intentionally designed to maximize user engagement and dependence and powerfully encourage children to engage in a repetitive and dopamine-driven feedback loop by watching, sharing, and attempting viral challenges and other videos. TikTok is programming children for the sake of corporate profits and promoting addiction.

Social media algorithms are definitely designed to promote engagement. That can’t be argued. But if the algorithm suggested the “blackout challenge” to Nylah Anderson, that suggestion is traceable to Anderson’s interactions with the service, as well as other users’ actions, which supposedly made the challenge “viral” (to use Law & Crime’s headline wording) and more likely to surface as a suggestion to TikTok users.

What this doesn’t indicate is that TikTok saw Anderson’s account and decided — without other contributing factors — to recommend she participate in a deadly “challenge.”

TikTok has long been accused of promoting “viral” challenges that can result in injuries or death. Most of the virality is due to hyperbolic, breathless coverage of something someone saw on the internet, rather than to actual trending videos hosted by the platform. What happened here isn’t a malicious or negligent act by TikTok. If the allegations are true, the algorithm surfaced something that was trending. It did not target a 10-year-old with a deadly challenge.

Whatever surfaced in the “For You” section is shielded by Section 230 and otherwise protected by the First Amendment. The plaintiff and her legal reps want to dodge this inevitable discussion by framing this as a defective product complaint. And they disingenuously pretend this has nothing to do with other TikTok users who contribute to algorithmic promotion of trending content.

Plaintiff does not seek to hold the TikTok Defendants liable as the speaker or publisher of third-party content and instead intends to hold the TikTok Defendants responsible for their own independent conduct as the designers, programmers, manufacturers, sellers, and/or distributors of their dangerously defective social media products and for their own independent acts of negligence as further described herein. Thus, Plaintiffs claims fall outside of any potential protections afforded by Section 230(c) of the Communications Decency Act.

Sure, you can assert that in a federal complaint. But that doesn’t mean judges are obligated to pretend the allegations have no nexus with Section 230.

The details of TikTok’s recommendation system are cited as being a contributor to this death and an indicator that the app’s creators are distributing an intentionally flawed product that values profits above user safety. And, while it’s certainly true profitability and user engagement are more of a concern than the distribution of potentially harmful content, TikTok gives users the tools to curate their feed, as well as information as to how their algorithmic recommendations are compiled. It’s not a complete black box and it indicates TikTok is at least making some effort to limit exposure to harmful content.

The lawsuit is crafted to avoid Section 230 discussions, but there’s simply no way to avoid them when you’re dealing with algorithmic recommendations that rely heavily on third-party content and users’ own interactions with the service. Beyond that, there’s the First Amendment, which allows social media platforms to decide which content they promote and which content they choose to ban or hide.

The lawsuit’s long list of challenges that have supposedly gone “viral” on TikTok is meaningless. Even this challenge (whose virality is, at best, disputed) is not unique to TikTok. It dates back to at least 2008. It apparently resurfaced in 2014, when it was again granted news coverage and resulted in doctors being asked to explain why choking yourself might be dangerous. Late last year, it reappeared like a cicada/horror movie villain to deliver more media consternation and more obvious statements that any challenge that utilizes asphyxiation is harmful.

This is not to blame the victim for her death. This is only to indicate this is neither a new problem, nor one that is directly traceable to TikTok. Stupid shit is cyclical. The new constant is social media services with millions or billions of users that can cause stupidity to spread further and faster.

What this isn’t is an actionable legal case. TikTok possibly could have done more to suppress surfacing of “blackout challenge” content. What it didn’t do is present its service as a harmless diversion. Moderation at scale is impossible. And no one truly expects social media services to value users personally over total users as an aggregate. Whatever increases the latter is bound to get more attention from those designing engagement algorithms. But none of this adds up to product liability. And pretending it does, just to avoid certain dismissal under Section 230, is a waste of a plaintiff’s time and energy. To make it clear, I’m not upset this parent decided to sue. I’m upset her lawyer wasn’t honest enough to let her know this effort would likely result in nothing but more heartache.

Companies: tiktok


Comments on “Lawsuit Attempts To Hold TikTok Responsible For Death Of 10-Year-Old Girl Who Engaged In A ‘Blackout Challenge’”

96 Comments
PaulT (profile) says:

“The TikTok Defendants’ algorithm determined that the deadly Blackout Challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson”

I watched all sorts of shit that I had no intention of imitating, even at 10 years old when I was watching banned “video nasty” VHS tapes. But I never copied any of it in real life.

“and she died as a result”

…of deciding to copy whatever was in the video despite it clearly being dangerous and stupid, and without anyone around her to discuss or give a different perspective.

“TikTok is programming children for the sake of corporate profits and promoting addiction”

So… it’s a corporation?

That Anonymous Coward (profile) says:

Re:

Right after they arrest & charge the parents for neglect.

TikTok is not responsible for this.

They did not put the phone in the child’s hand, they did not pay the bill, and they did not have a duty to ask what the child was looking at online, but the parents did.

Nathan F (profile) says:

Re: Re:

Oh, I’m not arguing that the parents didn’t have a responsibility to be doing the parental oversight thing. I’m just asking if TikTok can get dinged for letting a 10-year-old on in the first place, because that law I mentioned in the first post says child accounts have to be treated differently and with more “safeguards.”

Anonymous Coward says:

Re: Re:

they did not have a duty to ask what the child was looking at online, but the parents did.

Did they? Is there any case law supporting that? Statements like this bother me; my parents didn’t know what I was doing on BBSes and the internet as a kid (nor at the park or walking around the neighborhood, with any specificity). Had I felt like I’d have to live through adolescence in the modern “parents spy on everything you do” world, I don’t think I would’ve made it to adulthood. Privacy is a human right, and not one that should be reserved for adults.

Anonymous Coward says:

Re: Re: Re:

Agreed. Kids having their own privacy is important. A kid questioning their gender identity in an intolerant household could find themselves kicked out, abused, or worse if their parents found out. The notion that parents have to monitor their kids’ behavior 24/7 has glaring flaws.

TaboToka (profile) says:

Re: Re:

It seems natural to want to point and assign blame (the collective towards the parents or the parents towards TikTok), but the best we can do is learn and maybe look at prevention.

How could this have been prevented?

  • TikTok cannot stop kids from viewing things they could use to injure or kill themselves (i.e., moderation at scale).
  • Part of being a kid is doing stupid things, because their executive function isn’t fully developed. We cannot change the kids.
  • Parents can and should monitor what their kids are up to. They can’t catch everything, but if they find their kid is hurting or needs help, they can and should address it.
Anonymous Coward says:

Re: Re:

Are the parents able to oversee their child’s consumption of media 100% of the time? The “parents have to be omniscient Gods with infinite money so they don’t have to work to pay for things and thus be around to supervise their child 100%” schtick is getting really fucking old.

Anonymous Coward says:

Re: Re: Re:

Are the parents able to oversee their child’s consumption of media 100% of the time?

It’s the parents’ responsibility to oversee their children. That it can be difficult is irrelevant. They chose to have the child, and they need to accept the responsibility for the child’s safety.

The “parents have to be omniscient Gods with infinite money so they don’t have to work to pay for things and thus be around to supervise their child 100%” schtick is getting really fucking old.

As are parents who blame everyone but themselves for what happened while the child was in their care.

Naughty Autie says:

Re: Re: Re:3

Even if the law of some (not all) of the United States prevents you from getting an abortion, you can still give the kid up for adoption rather than allow them to have completely unsupervised access to a personal communications device until they accidentally commit suicide at ten years old. Chump.

Anonymous Coward says:

Re: Re: Re:5

Because someone else forced them to have a child. That someone should then take responsibility for that action.

Well, yeah. I 100% agree.

But we don’t know if this child was wanted, unwanted, or otherwise. All I was pointing out was that the circumstances under which the child was born don’t matter. You can’t use that as an excuse to avoid responsibility as the child grows up.

Anonymous Coward says:

Re: Re: Re:3

Then she acts like an adult and puts the kid with its father, or in foster care, or up for adoption, or in a box on the front steps of a local monastery. You don’t get to dodge responsibility in the real world by ignoring it, and no amount of misappropriating pro-choice talking points is going to change that.

That Anonymous Coward (profile) says:

“TikTok is programming children for the sake of corporate profits and promoting addiction.”

Sadly this child did not have parents who gave a shit until they could get paid.

They spent no time seeing how their offspring used social media, the expensive phone, or decisions she was making.

They had no rules about usage of the app, I guess because at the time they assumed a corporation would take much better care of their offspring than actual parents should/would/could.

How long did the child lay there before the parents decided they should check on their offspring?

The lawsuit lists all of these other challenges that resulted in people being hurt, and at no point did you uninstall TikTok from the phone or talk to your offspring about how sometimes the things you see aren’t good for you.

Kids watched the Superman movie, put a towel around their neck and leapt from the top of the garage, and those parents sued claiming the movie made them do it.
Imagine if a parent actually paid attention and asked a question about the gathering of a towel, a ladder…

It’s not polite, but I am rarely accused of being polite: you abdicated being a parent. You turned your child over to the internet and an app, expecting they would do the job you refused to do. You ‘discovered’ that she had been presented with other versions of this challenge only after the fact, something you might have caught if you had been available to your child & monitoring what she did.

You can try to blame everyone else & get a payday to soothe your soul, but you failed and no court is gonna change that fact.

Anonymous Coward says:

Re: Tiktok is not blameless.

Oh, to be sure, the parents are partly responsible, but that does not excuse TikTok’s culpability in my book. TikTok is not blameless in the matter of this child’s death, and ought to be held accountable for whatever part, if any, it played in it. A child is not raised by the parents alone but by society as well; that’s just a fact of life, no matter how much we as a western capitalist society value individuality and no matter how hands-off we want to be about the raising of children not ours. Society, especially certain elements like corporations, should not shirk responsibility when it comes to the well-being of children, or it will be bad in the long run. Social media that caters to young children ought to take special care. If it can be proven that this death was preventable by some reasonable steps TikTok could have taken, and Section 230 prevents TikTok from being held legally responsible for it, then the law has a problematic flaw in my opinion.

That Anonymous Coward (profile) says:

Re: Re:

“Society especially certain elements like corporations should not shrink responsibility in this when it comes to the well-being of the children or it will be bad in the long road.”

You mean like one corporation getting control of 40% of the baby formula market, cutting corners at a plant, and making children ill, while making sure the replacement for NAFTA made the importation of baby formula all but impossible?

Or like governments who are still dragging their feet about testing water supply lines in ‘poor’ areas.

Or like the meat processors who were taking bets on how many workers would die from covid after conning the Trump administration that they needed to stay open no matter what.

I was unaware that the TikTok target demo was 10 yr olds, I thought they were actually aiming higher.

Do you also believe that Insta should be held responsible for a child who develops an eating disorder in attempting to reach the idealized images users post?

No corporation is a baby sitter, and assuming that they give a shit about anything other than their profits is shocking in this day and age.

Should Ford have to pay because a bank robber drives a Ford away from a robbery? The robber might have hurt someone, so Ford has a duty to make sure their cars can’t be used by bank robbers, to keep everyone safe?

Naughty Autie says:

Re: Re:

How many videos does it host? As everyone knows, moderation at scale is impossible to do perfectly, which is why there are report buttons to draw human moderators’ attention to problematic content, yet obviously no one used them to flag videos containing potentially fatal challenges. In this case, therefore, TikTok is blameless, it’s just easier for grieving parents to sue the site and not the users who should have done more, as well as not accepting their own share of the blame.

nasch (profile) says:

Re:

Sadly this child did not have parents who gave a s*** until they could get paid.

They spent no time seeing how their offspring used social media, the expensive phone, or decisions she was making.

They had no rules about usage of the app, I guess because at the time they assumed a corporation would take much better care of their offspring than actual parents should/would/could.

This sounds like a lot of baseless speculation.

That Anonymous Coward (profile) says:

Re: Re:

If they had been looking they would have seen the first time a black out challenge was presented to the child to view.

If they had had a conversation with the child about not just doing things because you saw it on an app.

Having seen all of the media coverage claiming TikTok is the cause of all child hooliganism, what sort of parent allows a 10 yr old to have access to it?

nasch (profile) says:

Re: Re: Re:

If they had been looking they would have seen the first time a black out challenge was presented to the child to view.

How do you know this wasn’t the first time? Are they to continuously watch over their child’s shoulder during screen time?

If they had had a conversation with the child about not just doing things because you saw it on an app.

How do you know they didn’t?

Having seen all of the media coverage claiming TikTok is the cause of all child hooliganism, what sort of parent allows a 10 yr old to have access to it?

If parents believed all the moral panics, they wouldn’t let their kids do anything at all.

Anonymous Coward says:

Re: Re: Re:2

If parents believed all the moral panics, they wouldn’t let their kids do anything at all.

Agreed. But if you break it down on a more fundamental way, the Internet in general isn’t a safe place, especially if you’re young. I remember the days of Chris Hansen showing up to greet people who would graciously meet a child for who-knows-what.

That isn’t on the level of the Satanic Panic stuff, it’s basic online safety, and not letting your child do something that might compromise it.

nasch (profile) says:

Re: Re: Re:3

That isn’t on the level of the Satanic Panic stuff, it’s basic online safety, and not letting your child do something that might compromise it.

I agree, but I also think we can have some compassion for parents who don’t get it right. Just because a child dies doesn’t mean their parents didn’t care. Maybe they just made a mistake – which all of us do. “Let’s learn from this” is a better attitude than “welp guess you should have paid attention to your kid when you had the chance” IMO.

That Anonymous Coward (profile) says:

Re: Re: Re:2

“How do you know this wasn’t the first time?”

Because I read the complaint, where they specifically spelled out that the video that allegedly forced her to take her own life wasn’t the first time a version of that challenge had been put onto the child’s timeline.

“How do you know they didn’t?”

points at the casket

“If parents believed all the moral panics, they wouldn’t let their kids do anything at all.”

stares in we had a story about a mother arrested for allowing her children to go to the park on their own

That Anonymous Coward (profile) says:

Re: Re: Re:4

No, but parents are meant to keep tabs on their children & teach them what the rules of life are.

How many other stupid things had she seen on TikTok and tried?

How many times did the parent ask: what did you see today, what are you doing, why are you doing it, show me what made you laugh?

How many times did the parent take the phone & look at the history? 10-year-olds shouldn’t expect much privacy from their parents, and given all of the moral panics about how children are being sucked into phones and trafficked globally, perhaps the responsibility to look at what your child is doing & remind them of what you expect needs more press coverage than “this corporation did less than I did to protect my baby!”

Just think of me as Death filling in for the Hogfather…
https://9gag.com/gag/aGzGbv0

That Anonymous Coward (profile) says:

Re: Re: Re:6

My well of compassion (ok its a really really shallow puddle) goes right out the window once they start screaming it is everyone elses fault and they should pay me.

Like the parents of the girl who killed other people and her passengers (iirc) who sued Apple because they had a patent on the idea of disabling phones in motion but had never built anything.
It wasn’t the daughter’s fault she was speeding, it wasn’t the daughter’s fault she was texting, it was the fault of Apple because the corporation should have stopped their kid from texting and driving.

It is one thing to lash out in a moment of grief to the media or on facebook and yell at TikTok, it is another to demand payment because they should have taken better care than you took of your child.

Tanner Andrews (profile) says:

Re: Re: Re: Oh Look, Timmy Fell Down the Well Again, First Time this Week

what sort of parent allows a 10 yr old to have access to it

The problem is that kids are sneaky. They will do things without the parents’ knowledge. Maybe after the first few weeks of the series, you might think that Timmy’s parents should lock him in his room, but no one was arguing that someone should call Child Protective Services.

Kids have ways to evade parental oversight. It is probably important for their mental health that they should have some privacy, but even ignoring that, kids will find ways to be alone.

We are not going to be able to fill in all the wells, get Timmy a brain implant instead of relying on the dog, or keep kids off the internet. Nor can we really hope to keep the internet “safe”, what with the combination of the US 1st Amendment and other countries willing to host material not allowed in the States.

nasch (profile) says:

Re: Re: Re:

the idea that they were allowed entirely unsupervised access to an internet-enabled device is hardly “a lot of baseless speculation.”

That’s not all that was said. He said things like the parents “didn’t give a shit” and “had no rules” about social media usage. Based on, as far as I could tell, absolutely no evidence. In other words, baseless speculation.


Tanner Andrews (profile) says:

Re: Re: Re:3 Duty of Care

So basically the same basis for the lawsuit claiming that TikTok had a duty to do all sorts of things.

Pretty much. The plaintiff tries to claim violation of two states’ respective Unfair and Deceptive acts, but it comes down to claiming that they held the app out as safe yet someone watched bad videos.

Gonna sue the TV manufacturer when a kid attaches a towel around his neck and jumps off the roof, thinking he will fly like Superman in the cartoons? Might as well.

nasch (profile) says:

Re: Re: Re:3

Says the one defending parents that literally killed their child through neglect.

I’m defending telling the truth and speaking with compassion. Making up things that we don’t know about to make the parents look bad might make us feel better, because we can reassure ourselves that they are bad parents and we are good parents, so that could never happen to us. But it isn’t a useful approach to the situation.

aethercowboy (profile) says:

Not knowing much about Tik-Tok other than it’s not just a character from the Wizard of Oz stories, hearing this made me wonder if they had age gates to prevent kids under 13 from being able to use the service.

The fact that a 10-year-old had unfettered access to a social website without any parental supervision makes me, a parent of a 10-year-old, baffled.

I guess the parents don’t want to admit that they’re just negligent, and want to pass the buck.

discussitlive (profile) says:

Re: Tech and children

I did tech at K-12 for decades, for the entire district, in the main office. The ability of children to find ways around technological blocks and barriers is surpassed only by those with a pure profit motive. And the dads that would use the kid’s laptop after the kid went to bed were always an issue.
To put it as politely as I am able after that experience: Technology is not a babysitter, and trusting it to enforce your standards is a sure path to disappointment. Computers compute. Only parents can parent.

Anonymous Coward says:

1) Did the girl lie about her age?
Most likely. I believe that TikTok will not create an account if the age entered is that young.
2) “And while it’s understandable that grieving survivors often seek justice — whether it’s closure or compensation — through the legal system,” is incorrect. It is usually not the parents in a case like this who suggest suing TikTok etc. It is attorneys.

Anonymous Coward says:

Ignoring the Painfully Obvious, much?

not the Elephant, but the Blue Whale in the room.

so.

this “unsuspecting” “innocent” child was quietly sitting in her bedroom reading a book, and then RRRAAAGGHHH!!! the big old mean TikTok BROKE DOWN THE DOOR TO HER ROOM AND,…!!!!

uh huh.

and the child’s parents,… i guess were,.. what? first GAGGED & TIED UP so they would not be able to protect their child–

oh,.. riiiight:

it was THE PARENTS that bought the child the cellular telephone as a surrogate babysitter,…

it was THE PARENTS that purposefully refused to give their child ANY guidance about the dangers of the Internet,…

it is THE PARENTS that are ultimately responsible.

Anonymous Coward says:

Re: An here we have it: The illusion of total control

… and because the library had books describing LGBTQ relationships as other than abhorrent, it is THE PARENTS’ fault for letting the child read and develop tolerant views.

Do I have the idiom right?

If it isn’t the smart phone, it’s the computer, the television, the telephone, the library, the poetry circle, Those Kids Down The Block, or something else.

You are correct, that the parents are ultimately responsible, in that they are legal guardians of a minor. Where you fail, though, is assuming that they are able to control every aspect of a child’s thinking and actions.

That’s the problem with creating self-programming autonomous entities. They don’t always turn out how you wish. The most you can do is give them training sets that you hope will generate a sociologically adjusted individual.

Anonymous Coward says:

Re: Re:

Where you fail, though, is assuming that they are able to control every aspect of a child’s thinking and actions.

Frankly, it’s not TikTok’s responsibility to fill in that gap. And it’s naive to assume that you should leave a 10-year-old alone on the Internet. That is wholly the fault of the parents.

Anonymous Coward says:

Re: Re: Re:3

choose to have a child, and you need to make sacrifices.

Not every parent chooses to have a child, dumbass. Unwanted accidental pregnancies are a thing and abortion rights are quickly being demolished here in the shithole that is the United States.

If the parents needed assistance, they should’ve asked.

Because nannies and babysitters and other childcare services totally work for free, and when they don’t, it’s something that any and all parents can afford, right? /s

Anonymous Coward says:

Re: Re: Re:4

Unwanted accidental pregnancies are a thing and abortion rights are quickly being demolished here in the shithole that is the United States.

What difference does that make as far as your responsibility to the child? This kid was already here – you can’t just say ‘well, it was an unwanted pregnancy so that exonerates the parent.’

Naughty Autie says:

Re: Re: Re:4

Not every parent chooses to have a child, dumbass. Unwanted accidental pregnancies are a thing and abortion rights are quickly being demolished here in the shithole that is the United States.

If you don’t want a child and can’t abort, you give them up for adoption. If there are times you can’t be there for the child, you hire a baby sitter, a child minder, or an au pair. Come back to this debate when you have an actual point. Dipshit.

Anonymous Coward says:

Re: Re: Re:6

And if the reason you can’t be there is you have no job, and need to go searching for a job, how do you pay?

Hence the reason why having a child is an important decision, that should not be undertaken lightly, nor forced upon someone against their will.

We’ll likely find out soon enough when there’s an abundance of unwanted and unplanned children walking around. But I wouldn’t expect it to be pretty.

PaulT (profile) says:

Re: Re: Re:7

There’s also a lot of children who were brought into this world by parents who could afford them initially, but the state of the economy made it more difficult over the span of their childhood. Plenty of parents had a child when they had a good job, then had that job shipped overseas or eliminated when their employer went bust, and had to survive on a fraction of what they had when they started their family.

If people come to this subject under the assumption that over a child’s life the parents’ income can only go up and the expenses never can, they’re delusional.

PaulT (profile) says:

Re: Re: Re:5

“If you don’t want a child and can’t abort, you give them up for adoption”

I’d read up on the state of the adoption services in the US if I were you. Not only isn’t it pretty, but it’s often deliberately sabotaged by the same people saying that women have no choice other than to carry to term.

“If there are times you can’t be there for the child, you hire a baby sitter, a child minder, or an au pair.”

Because everyone has that money lying around…

Naughty Autie says:

Re: Re: Re:2

When I was ten, my parents never let me go online unless they were in the house so they could occasionally check on me and what sites I was accessing. Of course, they did make their task easier by putting an age filter on my account on the PC and not letting me have a mobile phone until they felt that I was mature enough.

Christenson says:

Warning notice

heya, Techdirt, I’m pretty sure we’ve seen this exact same article before, but about terrorists and one of their victims.

Anyway, what comes to mind here is that when we have posts of people doing stupid things and going viral, it might be helpful to have links to cautionary tales and/or notes. This particular challenge also reminds me of the one or two deaths per year the US records due to autoerotic asphyxia.

I’m also reminded of my misspent youth, digging in dirtpiles for fun, with my dad reminding me that the roof in an actual tunnel would be a problem, and the couple of kids that have died this week when their tunnels in sand collapsed on them.

TasMot (profile) says:

"for the sake of corporate profits"

“TikTok is programming children for the sake of corporate profits and promoting addiction.”

I’m not a corporate accountant or anything, but I’m absolutely, positively sure that killing off your users is not good, in any way, for maintaining or increasing corporate profits.

Anonymous Coward says:

Re: Re: Re:2

Because 10-year-olds are supposed to keep up with current events that happened when they were 8 years old, and have perfectly developed information literacy skills, right? If this 10-year-old didn’t read the news or do their research about algorithms or how to curate their feed, it’s entirely their fault that they died.

/s

This comment has been flagged by the community.

Anonymous Coward says:

Re: Re: Re:4

If their child had died in a school shooting, you’d probably try to blame the parents too, if your track record is any indication. Blame them for not having enough money to put them in a better school, or not teaching the child how to dodge out of the way of bullets a la Neo from The Matrix, or some shit like that. After all, you seem to think that every single parent in the world chose to have their child 100%, and parenting requires sacrifice.

Anonymous Coward says:

Re: Re: Re:5

If a child was hurt in a school shooting, the problem is the shooter, no? That’s a big leap you’re making there, inferring that I’d say the parents are at fault.

After all, you seem to think that every single parent in the world chose to have their child 100%, and parenting requires sacrifice.

I certainly don’t. But once the child’s here, the choice is kinda irrelevant.

That Anonymous Coward (profile) says:

Re:

No, the court will not be defrauded.
The court will treat the case like the hundreds of others just like it that are filed.

The only person exploited will be the parent.
They will be out the money, with the added bonus of the lawyer exploiting them to get more clients to pay for pretty lies that make them feel better.

They are promised that this case will hold someone responsible for a tragedy & let them assign the blame, then move forward with their life.
Instead they will discover that the law says the company isn’t responsible for the tragedy, and they will have to find something else to blame & will spend the rest of their lives trying to assign that blame anywhere as long as it is not near them.

John85851 (profile) says:

Lawyers

While everyone is debating whether the death was the fault of TikTok or the parents, can we all agree that the lawyers should know better? We can debate and discuss, but I bet most of us aren’t lawyers or legal experts.

Yet real lawyers filed this real lawsuit even though they should know the law and they should know the outcome. And the same lawyers took the parents’ money with the promise that they would win the case.
I’ve said this many times, but I think it’s past time that judges disbar lawyers for filing lawsuits like this.
