Coroner Lists ‘Negative Effects Of Online Content’ As One Of The Causes Of A UK Teen’s Death

from the yikes dept

So… this is a thing that happened. Adam Satariano reports for the New York Times:

The coroner overseeing the case, who in Britain is a judgelike figure with wide authority to investigate and officially determine a person’s cause of death, was far less circumspect. On Friday, he ruled that Instagram and other social media platforms had contributed to her death — perhaps the first time anywhere that internet companies have been legally blamed for a suicide.

“Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content,” said the coroner, Andrew Walker. Rather than officially classify her death a suicide, he said the internet “affected her mental health in a negative way and contributed to her death in a more than minimal way.”

This was the declaration entered as evidence in a UK inquest revolving around the suicide of 14-year-old Molly Russell. Also entered as evidence was a stream of disturbing content pulled from the deceased teen’s accounts and mobile device: videos, images, and posts related to suicide, including a post copied almost verbatim by Russell in her suicide note.

The content Russell apparently viewed in the weeks leading up to her suicide was horrific.

Molly’s social media use included material so upsetting that one courtroom worker stepped out of the room to avoid viewing a series of Instagram videos depicting suicide. A child psychologist who was called as an expert witness said the material was so “disturbing” and “distressing” that it caused him to lose sleep for weeks.

All of this led to Meta executives being cross-examined and asked to explain how a 14-year-old could so easily access this content. Elizabeth Langone, Meta’s head of health and well-being policies, had no explanation.

As has been noted here repeatedly, content moderation at scale is impossible. What may appear to be easy access to disturbing content may say more about the user’s determination than about the platform’s inability to curtail harmful content. And what may appear to be callous disregard for users may be nothing more than a person slipping through the cracks of content moderation, finding the content that intrigues them despite platforms’ efforts to keep it from surfacing unbidden in people’s feeds.

This declaration by the UK coroner is, unfortunately, largely performative. It doesn’t really say anything about the death other than what the coroner wants to say about it. And this coroner was pushed into pinning the death (at least partially) on social media by the 14-year-old’s parent, a television director with the apparent power to sway the outcome of the inquest — a process largely assumed to be a factual, rather than speculative, recounting of a person’s death.

Mr. Russell, a television director, urged the coroner reviewing Molly’s case to go beyond what is often a formulaic process, and to explore the role of social media. Mr. Walker agreed after seeing a sample of Molly’s social media history.

That resulted in a yearslong effort to get access to Molly’s social media data. The family did not know her iPhone passcode, but the London police were able to bypass it to extract 30,000 pages of material. After a lengthy battle, Meta agreed to provide more than 16,000 pages from her Instagram, such a volume that it delayed the start of the inquest. Merry Varney, a lawyer with the Leigh Day law firm who worked on the case through a legal aid program, said it had taken more than 1,000 hours to review the content.

What they found was that Molly had lived something of a double life. While she was a regular teenager to family, friends and teachers, her existence online was much bleaker.

From what’s seen here (and detailed in the New York Times article), Molly’s parents didn’t take a good look at her social media use until after she died by suicide. This is not to blame the parents for not taking a closer look sooner, but to point out how ridiculous it is for a coroner to deliver this sort of declaration, especially at the prompting of a grieving parent looking to find someone to blame for his daughter’s suicide.

If this coroner wants to list contributing factors on the public record — especially when involved in litigation — they should at least be consistent. They could have listed “lack of parental oversight,” “peer pressure,” and “unaddressed psychological issues” as contributing factors. This report is showboating intended to portray social media services as harmful and direct attention away from the teen’s desire to access “harmful content.”

And, truly, the role of the coroner is to find the physical causes of death. We go to dangerous places quickly when we start saying that this or that thing clearly caused someone to die by suicide. We don’t know. We can’t know. Even someone trained in psychology (not often the case with coroners) can’t ever truly say what makes a person take their own life. There are likely many reasons, and they may all contribute in their own ways. But in the end, it’s the person who makes the decision, and only they know the real reasons.

As Mike has written in the past, when we officially put “blame” on parties over suicide, it actually creates very serious problems. It allows those who are considering suicide the power to destroy someone else’s life as well, by simply saying that they chose to end their life because of this or that person or company or whatever — whether or not there’s any truth to it.

I’m well aware social media services often value market growth and user activity over user health and safety, but performative inquests are not the way to alter platforms’ priorities. Instead, they provide a basis for bad-faith litigation that seeks to hold platforms directly responsible for the actions of their users.

This sort of litigation is already far too popular in the United States. Its popularity in the UK should be expected to rise immediately, especially given the lack of First Amendment protections or Section 230 immunity.

It’s understandable for parents to seek closure when their children die unexpectedly. But misusing a process that is supposed to be free of influence to create “official” declarations of contributory liability won’t make things better for social media users. All it will do is give them fewer options to connect with people that might be able to steer them away from self-harm.

Companies: meta


Comments on “Coroner Lists ‘Negative Effects Of Online Content’ As One Of The Causes Of A UK Teen’s Death”

Samuel Abram (profile) says:

A terrible precedent

This has the bad side effect of making coroners untrustworthy and co-conspirators in moral panics. It would be as if the APA listed “Video Game Addiction” and “video-game-caused desensitization to violence” as actual psychological disorders. What this ultimately does is turn coroners into political tools and compromise their professional integrity.


Anonymous Coward says:

Re: Throwing Out The Baby With The Bathwater much?

So instead of addressing the reasons WHY this girl killed herself and working on that, your solution is to strip everyone of privacy and blame tech companies?

Let me be clear: what happened is nothing short of a tragedy, but people don’t become suicidal for no reason, and it’s not tech companies who are the root cause. If I’m remembering right, services to help people like Molly are not being allotted the funds they need to do their job. Maybe focus on THAT instead of blaming tech companies for all perceived failings.

The answer is “let’s invest more in addressing the root cause of why she was suicidal,” not “let’s turn the UK into a splinternet that won’t do anything to help children and would actually make things worse for them.”

Cowardly Lion says:

Re: Re: What?

Went broke trying to maintain the Empire, didn’t they?

Not really; they lost 2 generations defending Europe and the world against terrible, despotic oppression. It was the fact that they could draw on that empire that helped prevent the world being dominated by madmen.

And I wouldn’t call them broke.

Anonymous Coward says:

Re: Re: Re:

World War 1 was not about fighting against oppression. It was, quite literally, senseless.

And the British could never recover from that, to the point they were too fucking broke to shore up their defenses for WW2.

And the result? The Imperial Japanese occupation of Southeast Asia, COMPLETE WITH WAR CRIMES.

That’s your Fucking British Empire.

John85851 (profile) says:

What about the posters?

So the story talks about how Facebook and Instagram are to blame for not moderating content, but what about the people who posted the content in the first place?

Ah, yes, because it’s easier to haul the Facebook executives into court than to track down a user who might be halfway around the world.
Content moderation would be a lot easier if people didn’t post mean stuff to begin with, but that requires tougher questions and solutions. So, to make things easy, just blame the companies and websites.

glenn says:

Hey, mom and dad… your closure is in the mirror. How many years did you ignore your child’s behavior and actions? All of them? Did you ever really even want to have children? Or was it just “everyone else is doing it ’cause that’s what folks are supposed to do” …or whatever. Parenthood means responsibilities… and you failed to handle yours. Sadly, you’re a trend.

Anonymous Coward says:

Before I start, I agree that content moderation at scale is impossible, I just want to add something that wasn’t in the article above, but was part of the Coroner’s decision.

The ruling mentioned the algorithms utilised by the social media companies to sustain user engagement by serving content similar to what the user was already viewing.

“The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,” which “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help,” Walker said.

It’s not a very elegant summation of how the algorithm operates, or what the coroner would hope should happen.

It is an interesting question whether the platforms’ algorithms did, in fact, keep serving her content similar to what she was already viewing in order to keep her engaged.

It may merit study to see if that is actually the case, or otherwise.

Leaving that question unexamined may invite dangerous precedents via misinformed government action.

That Anonymous Coward (profile) says:


The algorithm killed Jesus.

The algorithm is dumb.

The algorithm has learned that people who search for or engage with X, often also enjoy Y and Z.

The algorithm has one job: keep the user engaged.

If the user stops looking at X but more at Y and not so much Z, the algorithm knows to serve up G I K because users who like Y more than Z also like G I K.

It has no way to understand what any of these things are.

There are not enough humans to review everything under each letter; it’s not like you can say that all self-harm content always ends up under Q, because it doesn’t.
See also: the YouTube rabbit hole, where you start out with some fairly normal videos and, four hours later, following the suggestions, find yourself watching some of the weirdest crap ever.
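The X/Y/Z description above is essentially item-to-item co-occurrence recommendation. As a toy sketch (purely illustrative, not any platform’s actual system; the users and item labels are made up):

```python
from collections import Counter

# Toy engagement log: which items each user interacted with.
# The recommender knows nothing about what X, Y, Z, G, etc. actually are.
engagement = {
    "user1": ["X", "Y", "Z"],
    "user2": ["Y", "G", "I"],
    "user3": ["Y", "Z", "G", "K"],
}

def recommend(history, engagement, top_n=3):
    """Rank items by how often they co-occur with the user's history."""
    scores = Counter()
    for items in engagement.values():
        if set(items) & set(history):    # this user overlaps with ours
            for item in items:
                if item not in history:  # don't re-serve what they've seen
                    scores[item] += 1
    return [item for item, _ in scores.most_common(top_n)]

# A user who engaged with Y gets served what Y-watchers also watched.
print(recommend(["Y"], engagement))
```

The point of the sketch is that `recommend` ranks by overlap counts alone; nothing in it knows, or can know, what any of those items depict.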

People keep insisting that the companies have a duty to protect children, but at the same time give parents a pass on having to participate in their child’s life.

People think companies can just nerd harder and we’ll have perfect AI overlords that show adults the right amount of the sexy-time content they like while making sure a child never sees an uncovered boob.

You put a several-hundred-dollar computer with unlimited access to the world into a child’s hands… but not once did you ask to see what your child was doing or look at their history, instead believing that the internet is a completely safe place because corporations will protect them…

Humans are stupid.

Anonymous Coward says:

Re: Re:

true that

The job of a social media algorithm can be expressed simply: recommend content to a user that they are likely to enjoy. The problem comes from the metrics the algorithm uses to judge how much a user likes certain types of content.

For example, if I were to watch videos about X most of the way through, and was likely to upvote/like/whatever said videos, the algorithm would assume that I enjoy this type of content and recommend more of it.

However, if the algorithm then decided for whatever reason to show me a video of Y (say it was incredibly popular with others and I had not spent enough time on the platform for it to learn my tastes fully, thus basing its recommendations on its knowledge of what the general user base of the platform liked) and I did not watch it for very long, it would likely not recommend such content in future.

The trouble comes when the user shows this type of “engagement” for content that is somehow harmful to them. The algorithm has no way of knowing a user’s emotional state (and it would be incredibly creepy if it did), thus it has no way to prevent this apart from harmful content being moderated (which, as has been established, is a can of worms of its own, and that’s ignoring any age verification, which is probably desired by such parents that refuse to take the time to so much as have a meaningful discussion with their children).
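The metric problem described above can be made concrete with a toy scoring function (hypothetical weights and signals, not any real platform’s formula): only behavioral signals go in, so nothing about the content’s nature is visible to the score.

```python
# Toy engagement score built purely from behavioral signals.
# The weights (0.7 / 0.3) are made up for illustration.
def engagement_score(watch_fraction: float, liked: bool) -> float:
    """Higher means 'recommend more like this' -- blind to content."""
    return 0.7 * watch_fraction + 0.3 * (1.0 if liked else 0.0)

# A benign video and a harmful one, each watched to 90% and liked,
# are indistinguishable to the metric.
benign_score = engagement_score(watch_fraction=0.9, liked=True)
harmful_score = engagement_score(watch_fraction=0.9, liked=True)
assert benign_score == harmful_score
```

Any real system is vastly more complicated, but the failure mode is the same: because the only inputs are behavioral, “user is engaging heavily” and “user is being harmed” can look identical.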

Yes, the events that occurred are tragic, but I feel the blame is just being pushed onto “Big Tech” rather than where it should go: the Tory austerity policies and the resulting criminal underfunding of the National Health Service. Not just its mental health arm, the entire National Health Service. It’s dying. For example, there is a gigantic backlog of people waiting for treatment, and ambulance wait times are measured in hours, even in areas with low wait times.

Perhaps the money wasted on this coroner’s verdict could have been better spent elsewhere, parliament?

Anonymous Coward says:


That’s exactly how a search engine works.

Just because it’s tailored to how humans behave doesn’t make it harmful. Unethical, I might agree, but harmful?

Note that it’s from Britain. Guess which ACTUALLY HARMFUL MEDIA EMPIRE makes a ton of money there?

We can talk about the ethics of using algorithms to game human attention right after we destroy Rupert Murdoch’s media empire.

Anonymous Coward says:

Parents should parent their children, not give them adult toys like the internet way too early. That might solve these types of problems.

Singapore, Malaysia, China–it’s nearly unheard of for kids to spend their lives online as they do in western nations. Kids have too many books to read, and other things to learn, like science, mathematics, language and so on.

Beyond first world problems, these types of teen suicides are indicative of something much darker–an existential crisis of culture.

Anonymous Coward says:


Singapore, Malaysia, China–it’s nearly unheard of for kids to spend their lives online as they do in western nations. Kids have too many books to read, and other things to learn, like science, mathematics, language and so on.

You clearly have not lived in these countries.

Yes, parents use iPhones and the like to distract the kids. We still don’t know how the fuck the kids are not affected, but the state media is geared towards propaganda, smear campaigns against non-PAP political parties and low-key shilling for China.
