Santana Boulton's Techdirt Profile


Posted on Techdirt - 8 August 2024 @ 04:30pm

Age-Gating Access To Online Porn Is Unconstitutional

Texas is one of eight states that have enacted laws that force adults to prove their age before accessing porn sites. Soon it will try to persuade the Supreme Court that its law doesn’t violate the First Amendment. 

Good luck with that. 

These laws are unconstitutional: They deny adults the well-established right to access constitutionally protected speech.

Texas’ H.B. 1181 forces any website made up of one-third or more adult content to verify every visitor’s age. Some adult sites have responded to the law by shutting down their services in Texas. The Free Speech Coalition challenged the law on First Amendment grounds, arguing that mandatory age verification does more than keep minors away from porn — the law nannies adults as well, barring them from constitutionally protected speech. 

The district court agreed with the challengers. Laws regulating speech because of its content (i.e., because it is sexually explicit) are presumed invalid. Under strict scrutiny, the state must show that its regulation is narrowly tailored to serve a compelling government interest. In other words, the government needs an exceptionally good reason to regulate, and it can’t regulate more speech than necessary. 

The case will turn on what level of scrutiny applies. Protecting minors from obscene speech is a permissible state interest, as the Fifth Circuit established when it applied the lowest form of scrutiny — rational basis review — to uphold the law. But not all speech that is obscene to minors is obscene to adults. Judge Higginbotham, dissenting from the Fifth Circuit’s decision, pointed out that kids might have no right to watch certain scenes from Game of Thrones, but adults do.

In previous cases involving laws that regulated minors’ access to explicit content, the Supreme Court applied strict scrutiny specifically because those laws also restricted adults’ access to protected speech. Texas hopes to get around decades of precedent by arguing that there is no way that age verification “could reasonably chill adults’ willingness” to visit porn sites. If adults don’t care about age verification, Texas reasons, nothing in the law stops them from viewing sexually explicit material: No protected speech is regulated.

There’s just one problem: Adults do care about age verification.

H.B. 1181 bars age verification providers from retaining “identifying” information. But nothing in the law stops providers from sharing that same info, and people are rightly concerned about whether their private sexual desires will stay private. Having it known that you visited an adult site at all is bad enough. Getting your personal Pornhub search history leaked along with your government ID is enough to make even the most shameless person consider changing their name and becoming a hermit.

Texas swears up and down that age verification tech is secure, but that doesn’t inspire confidence in anyone following cybersecurity news. Malware is out there. Data leaks happen. 

A bored employee glancing at your driver’s license as you walk into the sex shop is not the same thing as submitting to a biometric face scan and algorithmic ID verification, by order of the government, before you can press play on a dirty video. Just thinking about it kills the mood, which may be part of the point. 

Texas pretends there’s no difference between the bored bouncer and biometric scans, but if you knew the bouncer had an encyclopedic, inhuman memory for every name and face that came through the door, and loose lips to boot, well, you wouldn’t go there either.

Hand-waving away these differences is the kind of thing you only do if you’re highly ideologically motivated. But normal people are very reasonably concerned about whether their personal sexual preferences will be leaked to their boss, mother-in-law, or fellow citizens. Mandatory age verification turns people off of viewing porn entirely, and it chills their free expression. 

Sexual preferences are private and sensitive; they’re exactly the type of thing you don’t want leaking. So, of course, sexual content is a particularly juicy target for would-be hackers and extortionists. People pay handsomely to keep “sextortion” quiet. If you’re worried about your privacy and you don’t trust the age verification software (you shouldn’t), you’re likely to avoid the risk up front. One adult site says only 6% of visitors go through age verification and that even fewer succeed. Thus the chilling effect: even though adult access to porn is technically legal, people are so afraid of having their ID and last watched video plastered across the internet that they stop watching in the first place. 

If the Supreme Court recognizes this and applies strict scrutiny, it will ask whether less restrictive means could protect minors. Back in 2004, the Court tossed out COPA, a law requiring credit card verification to access sexually explicit materials, reasoning that blocking and filtering software would protect minors without burdening adult speech. Today’s filtering software is far more effective than what was available twenty years ago — as the district court found — and, notably, filtering software doesn’t scan adults’ faces. 

Sex — a “subject of absorbing interest to mankind,” as one justice once put it — matters. Adults have the right to sexually explicit speech, free of the fear that their identifying information will be leaked or sent to the state. Texas can and should seek to protect kids without stoking that fear. 

Santana Boulton is a legal fellow at TechFreedom and a Young Voices contributor. Her commentary has appeared in Techdirt. Follow her on X: @santanaboulton

Posted on Techdirt - 18 March 2024 @ 12:31pm

Moderating Eating Disorder Content Is Harder Than You Think

Both troubled teens and government agencies are asking, “How thin is thin enough?” The teens are thinking about how thin they want to look, while the government is thinking about what’s too thin to post online.

The refrain is always the same: the platforms need to do more, never mind the difficult details. Platforms need to remove posts that encourage eating disorders—but leave space for people to talk about recovery. They need to increase moderation. But they’re not supposed to use AI, which is fallible, easily gamed, and potentially illegal. And they’re not supposed to use more human moderators, who will be traumatized by their experiences.

Content moderation is impossible to do well at scale. Moderation around eating disorder content is especially fraught. And there’s no way to eliminate ED content online without broadly shutting down entire parts of the Internet.

Newcomers to this topic might think it’s easy to tell whether a particular post qualifies as eating disorder content. Is there a picture of a starving girl or not? We think we know it when we see it. And this is true sometimes: plenty of ED content openly identifies as such, usually in order to cordon itself off from the rest of the Internet.

An account bio might read, “edtwt, minors dni”—shorthand for “eating disorder Twitter, minors do not interact.” The bio might also note whether the person is pro-recovery, whether they post fatphobia, their favorite K-pop group, whether they go to the gym, what kind of eating disorder they have, and what kind of fashion they like.

Behind these accounts are individuals who are complex, imperfect, and hurting. And this content has a context. A post that’s so obviously eating disorder content on this account may not be so obvious when posted elsewhere. To be “pro-recovery” is to want to recover from your eating disorder, or at least to be open to it. The person behind an account may be in therapy or receiving outpatient treatment. Accounts dedicated to eating disorders, and accounts that don’t focus solely on eating disorders but belong to people struggling with them, may both discuss their experiences with the disorder and with recovery. Platforms cannot simply ban discussions about eating disorders without sweeping away plenty of honest conversations about mental health, and doing so might also drive the conversation toward less helpful corners of the internet.

And the platforms probably can’t ban fatphobia, at least not in a way that would take care of accounts that use fatphobia to encourage disordered behavior. If an ED account posts someone else’s video with a fatphobic caption, platforms can delete the ED account, but they shouldn’t delete the original video just because it could be harmful in the wrong hands. The same goes for pictures of people who happen to be skinny: the distinction between lean and clearly anorexic is not as clear-cut as you might like to imagine. We could have everyone register their BMIs and daily caloric intake with their username and password, to be sure that no unhealthy daily outfit pictures slip past the mods, but short of that appalling dystopia, we’re out of luck.

This is all a bit hypothetical and ridiculous, because, after all, platforms currently host plenty of self-identified ED content. The platforms could start by banning that.

And once platforms start banning openly disordered content, ED accounts would quickly stop being open about it. This isn’t speculative. These accounts have all sorts of alternative names and phrases invented specifically to evade content moderation. They can invent ED dog whistles much faster than platforms can ban them. Or an ED account can simply create a new account that doesn’t post eating disorder content at all—only borderline content: Kate Moss, particularly thin K-pop idols, girls who might have an eating disorder but might just be really into yoga. Sometimes you know it when you see it, but sometimes you don’t. When is posting a picture of a pack of gum ED content, and when is it just posting a picture of a pack of gum?

Proposals and lawsuits that aim to make platforms liable for harmful eating disorder content are asking for the impossible. The best outcome they could hope for is that platforms crack down on explicitly rule-breaking content and that ED content gets a bit more covert. Because it’s not going away. Companies have been removing ED content since 2001. Eating disorders are older than social media, and advocates who think platforms can moderate ED content out of existence understand neither eating disorders nor content moderation.

No one is saying platforms should do nothing. They should ban accounts and remove content that violates platform rules. And platforms should further develop their current tools. They should prompt users to reach out to helplines or to take breaks. They should let users broadly mute certain topics and ads. But platforms will never be able to remove all eating disorder content. Nor will they be able to remove all users with eating disorders. Nor should they, even if they could.

Social media platforms cannot solve eating disorders. They can and should aim to host better conversations in general, but we should stop expecting them to moderate mental health problems away.

Santana Boulton is a legal fellow with TechFreedom.