Sextortion Is A Real & Serious Criminal Issue; Blaming Section 230 For It Is Not

from the stay-focused-here dept

Let’s say I told you a harrowing story about a crime. Criminals from halfway around the world used fraud and social engineering to scam a teenager, convincing him his life was effectively destroyed. The teen then took an easily accessible gun from a parent and shot and killed himself. Law enforcement investigated the crime, tracked down the people responsible, extradited them to the US, and tried them. Eventually, they were sentenced to many years in prison.

Who would you blame for such a thing?

Apparently, for some people, the answer is Section 230. And it makes no sense at all.

That, at least, is the takeaway from an otherwise harrowing, distressing, and fascinating article in Bloomberg Businessweek about the very real and very serious problem of sextortion.

The article is well worth reading, as it not only details the real (and growing) problem of sextortion, but shows how a momentary youthful indiscretion — coaxed by a skillful social engineer — can destroy someone’s life.

The numbers on sextortion are eye-opening:

It was early 2022 when analysts at the National Center for Missing & Exploited Children (NCMEC) noticed a frightening pattern. The US nonprofit has fielded online-exploitation cybertips since 1998, but it had never seen anything like this.

Hundreds of tips began flooding in from across the country, bucking the trend of typical exploitation cases. Usually, older male predators spend months grooming young girls into sending nude photos for their own sexual gratification. But in these new reports, teen boys were being catfished by individuals pretending to be teen girls—and they were sending the nude photos first. The extortion was rapid-fire, sometimes occurring within hours. And it wasn’t sexually motivated; the predators wanted money. The tips were coming from dozens of states, yet the blackmailers were all saying the same thing:

“I’m going to ruin your life.”

“I’m going to make it go viral.”

“Answer me quickly. Time is ticking.”

“I have what I need to destroy your life.”

As the article details, there is something of a pattern in many of these sextortion cases. There are even “training” videos floating around that teach scammers how to effectively social engineer the result: get control over an Instagram or Snapchat account of a young girl and start friending/flirting with teen boys.

After getting flirty enough, send a fake nude and ask for one in return. Then, the second the teen boy does the teen boy thing and sends a compromising photo, the scammer goes straight into extortion mode, promising to ruin the boy’s life:

Around midnight, Dani got flirtatious. She told Jordan she liked “playing sexy games.” Then she sent him a naked photo and asked for one in return, a “sexy pic” with his face in it. Jordan walked down the hallway to the bathroom, pulled down his pants and took a selfie in the mirror. He hit send.

In an instant, the flirty teenage girl disappeared.

“I have screenshot all your followers and tags and can send this nudes to everyone and also send your nudes to your family and friends until it goes viral,” Dani wrote. “All you have to do is cooperate with me and I won’t expose you.”

Minutes later: “I got all I need rn to make your life miserable dude.”

As the article notes, this is part of the “playbook” that is used to teach the scammers:

The Yahoo Boys videos provided guidance on how to sound like an American girl (“I’m from Massachusetts. I just saw you on my friend’s suggestion and decided to follow you. I love reading, chilling with my friends and tennis”). They offered suggestions for how to keep the conversation flowing, how to turn it flirtatious and how to coerce the victim into sending a nude photo (“Pic exchange but with conditions”). Those conditions often included instructions that boys hold their genitals while “making a cute face” or take a photo in a mirror, face included.

Once that first nude image is sent, the script says, the game begins. “NOW BLACKMAIL 😀!!” it tells the scammer, advising they start with “hey, I have ur nudes and everything needed to ruin your life” or “hey this is the end of your life I am sending nudes to the world now.” Some of the blackmail scripts Raffile found had been viewed more than half a million times. One, called “Blackmailing format,” was uploaded to YouTube in September 2022 and got thousands of views. It included the same script that was sent to Jordan DeMay—down to the typos.

The article mostly focuses on the tragic case of one teen, DeMay, who shot himself very soon after getting hit with this scam. The article notes, just in passing, that DeMay had access to his father’s gun. Yet, somehow, guns and easy access to them are never mentioned as anything to be concerned about, even as the only two suicides mentioned in the article both involve teen boys who seemed to have unsupervised access to guns with which to shoot themselves.

Apparently, this is all the fault of Section 230 instead.

Hell, the article even describes how this was a criminal case: (somewhat amazingly!) the FBI tracked down the actual scammers in Nigeria, had them extradited to Michigan, and even got them to plead guilty to the crime (along with a mandatory minimum of 15 years in prison). And yet, apparently, this is still… an internet problem?

The reality is that this is a criminal problem, and it’s appropriate to treat it as such, where law enforcement has to deal with it (as they did in this case).

It seems like there are many things to blame here: the criminals themselves (who are going to prison for many years), the easy access to guns, even the failure to teach kids to be careful with who they’re talking to or what to do if they got into trouble online. But, no, the article seems to think this is all Section 230’s fault.

DeMay’s family appears to have been suckered by a lawyer into suing Meta (the messages to him came via Instagram):

In January, Jordan’s parents filed a wrongful death lawsuit in a California state court accusing Meta of enabling and facilitating the crime. That month, John DeMay flew to Washington to attend the congressional hearing with social media executives. He sat in the gallery holding a picture of Jordan smiling in his red football jersey.

The DeMay case has been combined with more than 100 others in a group lawsuit in Los Angeles that alleges social media companies have harmed children by designing addictive products. The cases involve content sent to vulnerable teens about eating disorders, suicide and dangerous challenges leading to accidental deaths, as well as sextortion.

“The way these products are designed is what gives rise to these opportunistic murderers,” says Matthew Bergman, founder of the Seattle-based Social Media Victims Law Center, who’s representing Jordan’s parents. “They are able to exploit adolescent psychology, and they leverage Meta’s technology to do so.”

Except all of that is nonsense. Yes, sextortion is problematic, but what the fuck in the “design” of Instagram aids it? It’s a communication tool, like any other. In the past, people used phones and the mail service for extortion, and no one sued AT&T or the postal service because of it. It’s utter nonsense.

But Bloomberg runs with it and implies that Section 230 is somehow getting in the way here:

The lawsuits face a significant hurdle: overcoming Section 230 of the Communications Decency Act. This liability shield has long protected social media platforms from being held accountable for content posted on their sites by third parties. If Bergman’s product liability argument fails, Instagram won’t be held responsible for what the Ogoshi brothers said to Jordan DeMay.

Regardless of the legal outcome, Jordan’s parents want Meta to face the court of public opinion. “This isn’t my story, it’s his,” John DeMay says. “But unfortunately, we are the chosen ones to tell it. And I am going to keep telling it. When Mark Zuckerberg lays on his pillow at night, I guarantee he knows Jordan DeMay’s name. And if he doesn’t yet, he’s gonna.”

So here’s a kind of important question: how would this story have played out any differently in the absence of Section 230? What different thing would Mark Zuckerberg do? I mean, it’s possible that Facebook/Instagram wouldn’t really exist at all without such protections, but assuming they do, what legal liability would be on the platforms for this kind of thing happening?

The answer is nothing. For there to be any liability under the First Amendment, there would have to be evidence that Meta employees knew of the specific sextortion attempt against DeMay and did nothing to stop it. But that’s ridiculous.

Instagram has 2 billion users. What are the people bringing the lawsuit expecting Meta to do? To hire people to read every direct message going back and forth among users, spotting the ones that are sextortion, and magically stepping in to stop them? That’s not just silly, it’s impossible and ridiculously intrusive. Do you want Meta employees reading all your DMs?

Even more to the point, Section 230 is what allows Meta to experiment with better solutions to this kind of thing. For example, Meta has recently announced new tools to help fight sextortion by using nudity detectors to try to prevent kids from sending naked photos of themselves.

Developing such a tool and providing such help would be riskier without Section 230, as it would be an “admission” that people use their tools to send nudes. But here, the company can experiment with providing better tools because of 230. The focus on blaming Section 230 is so incredibly misplaced that it’s embarrassing.

The criminals are actually responsible for the sextortion scam and the end results, and possibly whoever made it so damn easy for the kid to get his father’s gun in the middle of the night to shoot himself. The “problem” here is not Section 230, and removing Section 230 wouldn’t change a damn thing. This lawsuit is nonsense, and sure, maybe it makes the family feel better to sue Meta, but just because a crime happened on Instagram, doesn’t magically make it Instagram’s fault.

And that’s for good reason. As noted above, this was always a law enforcement situation. We shouldn’t ever want to turn private companies into law enforcement, because that would be an extremely dangerous result. Let Meta provide its communications tools. Let law enforcement investigate crimes and bring people to justice (as happened here). And maybe we should focus on better educating our kids to be aware of threats like sextortion, and how to respond if they happen to make a mistake and get caught up in it.

There’s lots of blame to go around here, but none of it belongs on Section 230.

Companies: meta


Comments on “Sextortion Is A Real & Serious Criminal Issue; Blaming Section 230 For It Is Not”

Whoever says:

It's not just kids who are victims -- also adults who should know better.

https://www.theguardian.com/politics/2024/apr/04/senior-tory-mortified-after-reportedly-passing-mps-data-to-dating-app-contact

In the UK an MP was extorted into providing private phone numbers of some of his colleagues to the perpetrator.

That One Guy (profile) says:

IF Online THEN Blame 230

The issue is that 230 has been used as such a universal boogieman by liars for so long that anytime something bad happens online the kneejerk reaction is that 230 must be involved and responsible somehow.

(230 making Steve Dallas lawsuits a lot more difficult might also have something to do with instances like this..)


Anonymous Coward says:

Re: Re: Re:

Where’s that Techdirt-destroying exposé, Jhon? Surely you’ve got at least MATT DRUDGE to print them out?

Or are you too busy ragecrying in front of Big Daddy Jeff Bezos, pinky-swearing that your latest Amazon-based data harvesting scam is totally a legitimate business and not a means to illegally broker data to the highest bidder?

That Anonymous Coward (profile) says:

I had posted to Twatter some of my thoughts…

The sextortion thing…
A whole bunch of people are going to ignore anything we tell them because the media is all over the “tech will protect you” story of Insta magically stopping teens from sending dick pics.

Here is the most honest thing that can be said to people of any age.

If you send naked content, you need to assume it will be on a billboard in Times Square at some point.

You can claim it is hyperbole, but how many more stories of shitty exes sending out nudes do you need?

While you giggle at Paris & Kim using it to gain more fame, stop and consider how you would feel in their shoes, with anyone able to look & comment, to seek you out and say horrible things to you, and with people you don’t even know quietly whispering when they see you.

Scams rely on you freaking out & needing to do something now now now without thinking.

If you pay, expect to keep getting farmed for cash. What, the person who got your pic by lying is going to delete the content because you paid?

Would you like to buy my bridge in Brooklyn?

I personally get like 3 emails a month where the hacker claims to have hacked me, monitored me, & has webcam video of me jerking off to porn sites and if I don’t pay they will send it to everyone!
Before I scramble to send the payment to a bitcoin wallet they list, I stop.

That email has no contacts to anyone I know.
I don’t have a webcam.
I don’t give a shit if I am seen naked by people I know.
I suspect they haven’t actually hacked me (never been a release of any video of me) but expected me to freak & pay.

Every human has “naughty bits” GASP.
People who would think less of you for someone gaining your trust to rip you off are shitty people.
You are the victim who has been violated & those who want to blame you are assholes.
You got scammed, lots of people have been the same way.

Don’t bother talking to the scammer, they will say anything to get you to send them cash.
Report the account to the platform.
Block the scammer.
Move on with your life.
Learn the lesson that people can and do lie online.
That assholes screw with your heart & your head.

Your parents have seen you naked way more than you should think about outside of therapy, so it’s not going to shock them.
People often refuse to admit to these things, which is what helps the scammers stay in business.

Just saying “someone tried to catfish & threaten me to get cash online but I laughed & blocked them” might help others be prepared.
The internet isn’t a safe place.
People lie.
If you expect to see your naughty pic on the screens in Times Square when you send it, the scam loses its power.

You’re horny, you are going to send naked pics.
Way more people do it than will admit it.

Learn to not freak out.

Notice Paris & Kim aren’t exactly pariahs in society & they had way more than naked pics get out there.

This isn’t the end of your world, deep breath, think.

https://twitter.com/That_AC/status/1780749800033358323

Something new I’ve seen popping up in my xhitter feed lately is someone selling nudes & details about young guys (allegedly 18+ because no boy EVER EVER would fib to a hot chick to get tit pics) who were catfished.
I reported the first 1 to support & it went as well as the 200+ slutbot accounts I reported that are still active months later.

Next one I see, I’m going to try to remember NCMEC is a thing. I’m guilty of that thing where we assume someone else will report it… I forgot I was someone.

Rather than attacking 230, BigTech, lack of prayer in schools as being the cause of these things perhaps we should admit that the fake morality & puritanical beliefs are the reason this works.

Society doesn’t worry about the boy who a female teacher bangs, they cheer him on for getting lucky.

But every other mix & match of genders in that situation is horrific. (Except when it was that pastor guy who groomed & molested a family member from 9 to 14 and has to serve 4 whole months & not have to register as a fucking predator).

HIV is still rocking the planet because aid to developing nations can’t mention condoms or safer sex to appease the moral majority who believe sex outside of marriage should be a death sentence (unless they are their own mistresses).

How many more teens need to cap themselves before we stop pretending someone seeing your dick pic will ruin your entire life forever?? Every man has one (allegedly… eyes congress), some are nicer than others, but people believing your dick pic being out there will harm your future like the Abu Ghraib prisoner torture pics needs to stop.

Anonymous Coward says:

Re:

Every man has one (allegedly… eyes congress)

All men are men, unless it’s in a discussion with shoeonhead to point out that men are lying about a loneliness epidemic, in which case the transmen are allies and aren’t in the list of people we’d say deserve the loneliness.

In the same way that transwomen are women, unless they’re running on the same track and field.

people believing your dick pic being out there will harm your future like the Abu Ghraib prisoner torture pics needs to stop

Just don’t have a dick, problem solved. There’s nothing a strapon or dildo can’t do that a dick already could. Except offspring, but honaaaaaaay, you are not having children in this economy or climate or social situation. Even if you’re insane enough to want one, surrogacy exists. Of course the Pope doesn’t like the idea of surrogacy but who the fuck cares what a pasty old pedophile thinks anyway?

Bruce C. says:

Re: Random thoughts...

1) It would be poetic justice if someone posted deepfake nudes (skipping the extortion step) of these perps on the same forums they use to make their threats.

2) (Wishful thinking) This whole problem would go away if we’d stop shaming people about sex and being sexy. Then most people would take AC’s attitude of “publish and be damned” if such extortion came in.

3) In the absence of the miracle of #2, Deepfake AI images are making/going to make this crime explode, even for people who have never posed nude. So as a second best alternative, a general consensus that “all nudes on the internet are fake” would achieve much the same results.

LAN8 says:

Devil's Advocate

So, right from the start I agree this has nothing to do with 230. However, FB could have been designed differently and arguably SHOULD be re-designed to correct for what we now know is real harm done fairly easily to real kids (and other people).

For instance:
– Seeing the profile of someone you don’t have a contact in common with should be hard by default. Let the user open that up (opt in) if they want.
– Comments on someone’s feed should have some sort of mute button, and comments should be reviewed and accepted by the owner of the feed first by default.
– Any time a person is mentioned, they should get to review the comment first, to slow or stop trolling.
– It should be an easy click to turn off comments for a specific post.
– Lots of research has gone into addicting people to “likes” and interrupting them with notifications. Notifications should be off by default and people should be required to post a comment before they can “like” something. That way you’re driving real engagement with a topic rather than tapping into addictive behaviour patterns for a dopamine reward.
– There should be some sort of automated pattern-matching AI algorithm to catch bots and scammers who are targeting people: something that runs by machine before a real person gets involved in looking at user content, escalating to a person to determine whether illegal activity is occurring.
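A rough sketch of what that first machine pass might look like: flag messages that match known extortion-script phrasing before any human review. The patterns below are hypothetical examples, loosely based on the script lines quoted in the article; a real system would need far more than keyword matching.

```python
import re

# Hypothetical example phrases, modeled on the blackmail scripts quoted in
# the article. A production system would use a trained classifier, not a
# hand-written list like this.
EXTORTION_PATTERNS = [
    re.compile(r"\bruin your (entire )?life\b", re.IGNORECASE),
    re.compile(r"\bmake (it|this) go viral\b", re.IGNORECASE),
    re.compile(r"\btime is ticking\b", re.IGNORECASE),
    re.compile(r"\bsend (your |ur )?nudes? to (everyone|the world|your family)\b", re.IGNORECASE),
]

def flag_for_review(message: str, threshold: int = 1) -> bool:
    """Return True if the message matches enough known script patterns
    to be escalated to a human reviewer."""
    hits = sum(1 for pattern in EXTORTION_PATTERNS if pattern.search(message))
    return hits >= threshold
```

The point of the threshold is exactly the escalation step described above: the machine only surfaces candidates, and a person makes the actual call about whether illegal activity is happening.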

Those are only a very few ideas off the top of my head but along with porting personal information to Solid (Tim Berners-Lee’s new personal information container) might help solve some of these inherent deficiencies in social media that allow trolls to literally kill people. Yes, it means they’re suing to fix these issues and force a company (either public or private) to conform to some standard. But we already do that with some companies (telcos, for instance) and we already know what happens when everyone can buy a gun easily. Now, the internet isn’t a gun, but it’s at least as mighty as the sword. It has caused harm to many individuals and just leaving it “laissez faire” isn’t solving any of the real harms we’re already aware of. In fact companies are sticking to obviously flawed choices in information design that we’re all well aware of at this point because they don’t want to put in the work to change something marketable. Well, maybe that isn’t good enough. There should be curbs for trolls and misinformation campaigns (often from nation states) and also for predators.

There does need to be a balance between freedom of speech (and limiting swearing isn’t doing anything meaningful) and limiting real harms. But limiting harms that cause kids, who are by nature naive and panicky, to commit suicide is something we should attempt. It’s illegal, for instance, to heckle someone into committing suicide. If that’s illegal IRL and people are being prosecuted for it, then shouldn’t we have all sorts of barriers in place to make that harder to do? If these scripts are available online, couldn’t an AI parser read everything, match the pattern of the script, and flag that for possible intervention? Letting a company get away with making billions while not changing their flawed interventions that really do very little to stop these damaging behaviours (at least so far) seems gross.

Arianity says:

For there to be any liability under the First Amendment, there would have to be evidence that Meta employees knew of the specific sextortion attempt against DeMay and did nothing to stop it. But that’s ridiculous.

I mean, in this specific case, perhaps. But in general, social media companies ignore active signs (such as reports) for conduct all the time. It’s not that ridiculous.

And 230 does protect some of those cases (or lax enforcement/monitoring), just as much as it protects reasonable cases. It makes no distinction, other than not interfering with federal law.

Yes, sextortion is problematic, but what the fuck in the “design” of Instagram aids it?

Being able to DM people (including minors) with no supervision, presumably. There are potential tools between reading DMs, and doing nothing, that are possible. (For instance, the recent tool you just highlighted. Not having that earlier is a part of the design that aids it).

It’s a communication tool, like any other.

Those other communication tools are also just as liable.

In the past, people used phones and the mail service for extortion, and no one sued AT&T or the postal service because of it.

Phones/mail didn’t have the tools to address it the way a digital service does. We absolutely would expect them to, if they had.

Developing such a tool and providing such help would be riskier without Section 230,

That depends entirely on what the replacement for it is. Not all replacements would be riskier. (The same also applies to the above. You’re looking at it assuming 230 is gone and the First Amendment governs, but it’s entirely possible to create new forms of liability)

As noted above, this was always a law enforcement situation.

Law enforcement is never going to be able to be as proactive as a company is with its own service, and we shouldn’t want it to be. It’s entirely possible to push companies to do a better job at this, in a way that complements law enforcement, and it doesn’t make them “law enforcement” to do so. They should have an obligation to design their tools to be as responsible as possible, when it is feasible. Pretending that’s “law enforcement” is ridiculous.

Developing such a tool and providing such help would be riskier without Section 230, as it would be an “admission” that people use their tools to send nudes.

The only problem here is that actively combating it counts as an admission, and negligence doesn’t. Meta shouldn’t get to pretend that it’s not (obviously) very aware that people, including minors, use their tools to send nudes. It’s a legal fiction, they very obviously know that’s happening.

The reality is that this is a criminal problem, and it’s appropriate to treat it as such, where law enforcement has to deal with it (as they did in this case).

It’s both. Treating it as solely a law enforcement problem is inadequate and unnecessary. You can argue how far that should go, of course. There are methods that would be far too intrusive. But it’s also not only law enforcement.

Tim Willingham says:

What if he used a knife instead?

The fixation on his use of a gun is odd, since there are any number of other implements used to end a life by suicide. It comes off as a diversion from the issue, which, as you correctly stated, is not Section 230. But you know prosecutors will use every weapon given them to pursue who they deem a criminal. All you have to do is look at Trump’s various bogus charges to see that.

emanuel celano (user link) says:

We would like to offer our contribution from Italy

We would like to offer our contribution from Italy with the site https://ricattosessuale.it/ (from the menu at the top right you can change the language of the site), where it is possible to delve deeper into the topic of sexual extortion after video chats or the sending of intimate images, including free advice from top experts in the sector with over 14 years of experience acquired in the field. An OBSERVATORY ON SEXUAL BLACKMAIL has been established, with news on the topic of sextortion in ITALY AND IN THE WORLD updated every month: https://ricattosessuale.it/category/osservatorio-ricatti-sessuali-in-italia/ Furthermore, an important online petition entitled “Online sexual blackmail: we need a COMPUTER EDUCATOR in schools” has been launched. This petition needs maximum support to establish a new professional figure in all schools in Italy, the IT educator, responsible for informing and training middle and high school students on the dangers of the internet (not only those linked to sexual extortion) and on the defenses to put in place to prevent or combat the phenomenon of cyberbullying. You can sign here: https://ricattosessuale.it/firma-la-petizione-ricatto-sessuale-online-occorre-un-educatore-informatico-nelle-scuole/
