Why SESTA Is Such A Bad Bill

from the so-much-damage dept

We’ve been talking quite a bit about SESTA — the Stop Enabling Sex Traffickers Act — and why it’s so problematic, but with hearings today, I wanted to dig a bit more closely into the text to explain exactly where it goes wrong. There are a large number of problems with the bill, so let’s discuss them one by one.

Undermines the incentives to moderate content and to work with law enforcement:

This remains the biggest issue for me: the fact that the bill is clearly counterproductive to its own stated goals. When people talk about CDA 230, they often (mistakenly) only talk about CDA 230(c)(1) — which is the part that says sites are immune from liability. This leads many people to (again, mistakenly) claim that the only thing CDA 230 is good for is absolving platforms from doing any moderation at all. But this actually ignores the equally important part of the same section: CDA 230(c)(2) which explicitly encourages platforms to moderate “objectionable” content, by noting that good faith efforts to moderate and police that content have no impact on your protection from liability in part (1).

In other words: as currently written, CDA 230 says that you’re encouraged to moderate your platform and take down bad content, because there’s no increase in legal liability if you do so. Indeed, it’s difficult to find a single internet platform that does zero moderation. Most platforms do quite a bit of moderation, because otherwise they would be overrun by spam. And, if they want people to actually use their platforms, nearly every site (even those like 4chan) tends to do significant moderation in response to public pressure to keep certain content off. Yet, under SESTA you now face liability if you are shown to have any “knowledge” of violations of federal sex trafficking laws. But what do they mean by “knowledge”? It’s not at all clear, as the bill just says “knowledge.” Thus, if a site, for example, discovers someone using its platform for trafficking and alerts authorities, that’s evidence of “knowledge” and can be used against it both in criminal charges and in civil lawsuits.

In other words, somewhat incredibly, the incentive here is for platforms to stop looking for any illegal activity on their sites, out of fear of creating knowledge which would make them liable. How does that help? Indeed, platforms will be incentivized not to do any moderation at all, and that will create a mess on many sites.

The vague “knowledge” standard will be abused:

This is sort of a corollary to the first point. The problematic language in the bill is this:

The term “participation in a venture” means knowing conduct by an individual or entity, by any means, that assists, supports, or facilitates a violation…

But what do they mean by “knowing conduct”? Who the hell knows. We already know that this is going to get litigated, probably for decades, in court. We have similar problems in the DMCA’s safe harbors, where legal battles have gone on for many years over whether the standard is “general knowledge” vs. “specific knowledge” and what is meant by “red flag knowledge.” And in SESTA the language is even less clear. When people have attempted to pin down SESTA’s sponsors on what the standard is for knowledge, they’ve received wildly varying answers, which just means there is no standard, and we’ll be seeing lawsuits for probably decades before it’s established what is meant by “knowledge.” For companies, again, the best way to deal with this is to not bother doing any moderation of your platform whatsoever, so you can avoid any claim of knowledge. That doesn’t help at all.

The even vaguer “facilitation” language will be massively abused:

In that same definition of “participation in a venture,” what may be even more problematic than the vague “knowledge” standard is the vaguer claim that an entity that, “by any means… assists, supports, or facilitates a violation” of sex trafficking laws meets the standard of “participation in a venture.” All three of those terms have potential problems. Assisting sounds like it requires proactive action — but how do you define it here? Is correcting typos “assisting”? Is having an automated system suggest keywords “assisting”? Is autocompleting searches “assisting”? Lots of sites do things like that, and none of it gives them any actual knowledge of legal violations. How about “supporting”? Again, perfectly benign activities can be seen as “supporting” criminal behavior without the platform being aware of it. Maybe certain features are used in a way that can be seen as supporting. We’ve pointed out that Airbnb could be a target under SESTA if someone uses an Airbnb rental for sex trafficking. Would the fact that Airbnb handles payments and reviews be seen as “supporting”?
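To make the autocomplete point concrete, here is a minimal, purely hypothetical sketch (in Python, with made-up names; not any real platform’s code) of a generic prefix-matching suggestion feature. Nothing in it knows or cares what a query means, which is exactly the problem with treating such a feature as “assisting” a crime:

```python
# Purely hypothetical sketch of a generic autocomplete / keyword-suggestion
# feature (made-up names, not any real platform's code). It ranks previously
# seen queries by popularity and matches on prefix only; it has no idea what
# any query means or what the user typing it intends.

from collections import Counter


class Autocomplete:
    def __init__(self) -> None:
        self.query_counts: Counter[str] = Counter()

    def record_query(self, query: str) -> None:
        """Remember a query so it can be suggested to later users."""
        self.query_counts[query.lower().strip()] += 1

    def suggest(self, prefix: str, limit: int = 5) -> list[str]:
        """Return the most popular stored queries that start with `prefix`."""
        prefix = prefix.lower().strip()
        matches = [(q, n) for q, n in self.query_counts.items() if q.startswith(prefix)]
        matches.sort(key=lambda pair: pair[1], reverse=True)
        return [q for q, _ in matches[:limit]]


if __name__ == "__main__":
    ac = Autocomplete()
    for q in ["used cameras", "used cameras", "used camera lenses", "used cars"]:
        ac.record_query(q)
    print(ac.suggest("used ca"))  # ['used cameras', 'used camera lenses', 'used cars']
```

The same function would suggest “used cameras” and whatever coded phrase a bad actor had typed before; it has no knowledge of intent either way.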

But the broadest of all is the term “facilitating.” That covers basically anything. That’s flat out saying “blame the tool for how it’s used.” Almost any service online can be used to “facilitate” sex trafficking in the hands of sex traffickers. I already discussed Airbnb above, but what if someone uses Dropbox to host sex trafficking flyers? Or what if a sex trafficker creates advertisements in Google Docs? Or what if a pimp creates a blog on WordPress? What if they use Skype for phone calls? What if they use Stripe or Square for payments? All of those things could be facilitation under this law, and the companies would have no actual knowledge of what’s going on, but would face not only criminal liability but also civil suits from victims, who may target them rather than the actual traffickers.

This is the core problem: this bill targets the tools rather than the law breakers.

Punching a hole in CDA 230 will be abused:

This is one that seems to confuse people who don’t spend much time looking at intermediary liability protections: how they work and how they get abused. It’s completely normal for people in that situation not to recognize how widely intermediary liability is used to stifle perfectly legitimate speech and activity. However, we know damn well from looking at the DMCA, in particular, that when you set up a process by which there might be liability on a platform, it’s regularly abused by people angry about content online to demand censorship. Indeed, we’ve seen people regularly admit that if they see content they dislike, even if there’s no legitimate copyright claim, they’ll “DMCA it” to get it taken down.

Here, the potential problems are much, much worse. At least within the DMCA context, you have relatively limited damages (compared to SESTA, anyway: monetary damages under the DMCA can add up quickly, but they’re only monetary and capped at $150,000 per work infringed). With SESTA, criminal penalties are much more stringent (obviously), which will create massive incentives for platforms to cave immediately rather than face the risk of criminal prosecution. Similarly, the civil penalties have no upper bound under the law — meaning the potential monetary penalty may be significantly higher.

The chilling effects of criminal charges:

Combine all of this and you create massive chilling effects for any online platform — big or small. I already explained why the new incentives will be not to help law enforcement or to moderate content at all, for fear of creating “knowledge,” but it’s even worse than that. For many platforms, the massive potential liability under SESTA will hang over every feature they offer. A comment feature on a website would become a huge liability. Any service that might conceivably be used by anyone to “facilitate” sex trafficking creates the potential for serious criminal and civil liability, which should be of great concern. It would likely lead to many platforms not being created at all, just because of the potential liability. For ones that already exist, some may shutter, and others may greatly curtail what the platform allows.

State Attorneys General have a terrible track record on these issues:

In response to the previous point, some may point out (correctly!) that CDA 230 already exempts federal criminal law — meaning that the DOJ can go after platforms today if it finds that they’re actively participating in sex trafficking. But, for as much as we rag on the DOJ, it tends not to be in the business of going after platforms just for the headlines. State AGs, on the other hand, have a fairly long history of doing exactly that — including directly at the behest of companies looking to strangle competitors.

Back in 2010 we wrote about a fairly stunning and eye-opening account by Topix CEO Chris Tolles about what happened when a group of State Attorneys General decided that Topix was behaving badly. Despite the fact that they had no legal basis for doing so, they completely ran Topix through the wringer, because it got them good headlines. Here’s just a snippet:

The call with these guys was actually pretty cordial. We walked them through how we ran feedback at Topix, that how in January 2010, we posted 3.6M comments, had our Artificial Intelligence systems remove 390k worth before they were ever even put up, and how we had over 28k feedback emails and 210k user flags, resulting in over 45k posts being removed from the system. When we went through the various issues with them, we ended up coming to what I thought was a set of offers to resolve the issues at hand. The folks on the phone indicated that these were good steps, and that they would circle back with their respective Attorneys General and get back to us.

No good deed goes unpunished

So, after opening the kimono and giving these guys a whole lot of info on how we ran things, how big we were and that we dedicated 20% of our staff on these issues, what was the response. (You could probably see this one coming.)

That’s right. Another press release. This time from 23 states’ Attorneys General.

This pile-on took much of what we had told them, and turned it against us. We had mentioned that we required three separate people to flag something before we would take action (mainly to prevent individuals from easily spiking things that they didn’t like). That was called out as a particular sin to be cleansed from our site. They also asked us to drop the priority review program in its entirety, drop the time it takes us to review posts from 7 days to 3 and “immediately revamp our AI technology to block more violative posts” amongst other things.

And, remember, this was done when the AGs had no legal leverage against Topix. Imagine what they would do if they could hold the threat of criminal and civil penalties over the company?
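For a sense of what a rule like Topix’s three-flag requirement actually amounts to, here is a minimal, hypothetical sketch (made-up names and numbers; Topix’s real system isn’t public) of flag-threshold moderation: a post is hidden and queued for human review only after several distinct users flag it, so a single angry user can’t spike content unilaterally.

```python
# Hypothetical sketch of a flag-threshold moderation rule, loosely modeled on
# the Topix account quoted above (illustrative only, not Topix's actual code).
# A post is hidden and queued for human review only once enough *distinct*
# users have flagged it.

FLAG_THRESHOLD = 3  # Topix reportedly required three separate flaggers


class Post:
    def __init__(self, post_id: str, body: str) -> None:
        self.post_id = post_id
        self.body = body
        self.flagged_by: set[str] = set()  # distinct user IDs that flagged this post
        self.hidden = False

    def flag(self, user_id: str) -> bool:
        """Record a flag; hide the post once enough distinct users agree."""
        self.flagged_by.add(user_id)
        if not self.hidden and len(self.flagged_by) >= FLAG_THRESHOLD:
            self.hidden = True
            send_to_review_queue(self)
        return self.hidden


def send_to_review_queue(post: Post) -> None:
    # A real system would notify human moderators; here we just log it.
    print(f"Post {post.post_id} hidden pending review ({len(post.flagged_by)} flags)")


if __name__ == "__main__":
    p = Post("1234", "suspicious listing")
    p.flag("alice")  # 1 distinct flag: still visible
    p.flag("alice")  # duplicate flag from the same user doesn't count again
    p.flag("bob")    # 2 distinct flags: still visible
    p.flag("carol")  # 3 distinct flags: hidden and queued for review
```

As Tolles notes, the multi-flag requirement exists mainly to keep individual users from spiking posts they simply don’t like, and it was that very safeguard the AGs singled out as a sin.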

Similarly, remember how leaked Sony emails revealed that the MPAA deliberately set up Mississippi Attorney General Jim Hood with the plan to attack Google (with the letter Hood sent actually being written by the MPAA’s outside lawyers)? If you don’t recall, Hood used claims that, because he was able to find illegal stuff via Google, it meant he could go on a total fishing expedition into how it handled much of its business.

In the Sony leak, it was revealed that the MPAA viewed a NY Times article about the value of lobbying state AGs as a sort of playbook to cultivate “anti-Google” Attorneys General, who it could then use to target and take down companies the MPAA didn’t like (remember, this was what the MPAA referred to, unsubtly, as “Project Goliath”).

Do we really want to empower that same group of AGs with the ability to drag down lots of other platforms with crazy fishing expeditions, just because some angry Hollywood (or other) companies say so?

Opening up civil lawsuits will be abused over and over again:

One of the big problems with SESTA is that it will open up internet companies to getting sued a lot. We already see a bunch of cases every year where people who are upset about certain content online target lawsuits at those sites just out of anger. The lawsuits tend to get thrown out, thanks to CDA 230, but lawyers keep trying creative ideas to get around CDA 230, adding in all sorts of frivolous claims. So, for example, after the decision in the Roommates case — in which Roommates.com got dinged for activity not protected by CDA 230 (specifically, its own actions that violated fair housing laws) — lots of people now cite that case as an example of why their own argument isn’t killed off by CDA 230.

In other words, if you give private litigants a small loophole to get around CDA 230, they will try to jump in and expand it to cover everything. So if SESTA becomes law, you can expect lots of these lawsuits, where people will go to great lengths to argue that just about any claim is not protected by 230 because of supposed sex trafficking occurring via the site.

Small companies will be hurt most of all:

There’s this weird talking point making the rounds that the only one really resisting SESTA is Google. We’ve discussed a few times why this is wrong, but let’s face it: of all the companies out there, Google is probably best positioned (along with Facebook) to weather any of this. Both Google and Facebook are used to massive moderation on their platforms. Both companies have built very expensive tools for moderating and filtering content, and both have built strong relationships with politicians and law enforcement. That’s not true for just about everyone else. That means SESTA would do the most damage to smaller companies and startups, which simply cannot invest the resources to deal with constant monitoring and/or threats arising from how people use their platforms.

Given all of these reasons, it’s immensely troubling that SESTA supporters keep running around insisting that the bill is narrowly tailored and won’t really impact many sites at all. It suggests either a willful blindness to the actual way the internet works (and how people abuse these systems for censorship) or a fairly scary ignorance level, with little interest in getting educated.



Comments on “Why SESTA Is Such A Bad Bill”

aerinai says:

Good luck when 'coded messages' enter the mix

Let’s say that I ran a small web forum that allowed people to buy and sell old electronics equipment. I have a small fan base with a few thousand comments a day. This is more than a single person can moderate, but not enough ad revenue to actually hire anyone. If a few users began to offer to sell ‘televisions’ with ads like ‘2010 36″ Sony Television for $300’ with a picture of a child and the TV, most people wouldn’t think anything of it.

Scenario 1:
Let’s say that this creep was instead pimping out his kid, got caught, and then told the police what he was doing on your site to meet customers.

At that moment, the police come knocking…

Scenario 2:
Let’s say that some rival business/crazy person on the internet (remember the sex ring Clinton had in a pizza place?) decided it didn’t like your site and leaked a ‘tip’ to cops about a potential pimp selling his wares on your site… At that moment, the police come knocking…

In either scenario, you would not have had knowledge. NOW you have knowledge. You have two options: shutter your web site or figure out who the hell is selling TVs and who is a pimp… All I have to say is good luck… Just a shame you have to send your passion for used electronics swapping down the river because of sex crimes you had nothing to do with.

Anonymous Coward says:

…sounds like a severe lack of faith in representative democracy and American government.

Why do you so distrust the US Senate/Congress to make rational decisions on SESTA?
Why do you see things so differently than many highly experienced legislators?

Are not CDA issues a critical responsibility of the Federal Government?

Anonymous Coward says:

Re: Re:

“Why do you so distrust the US Senate/Congress to make rational decisions”

Because of their history

“Why do you see things so differently than many highly experienced legislators”

Because they are legislators, highly experienced in various nefarious endeavors.

“Are not CDA issues a critical responsibility of the Federal Government?”

Is CDA exclusive to federal court proceedings?

Anonymous Coward says:

Civil remedies

Opening up civil lawsuits will be abused over and over again.

From Sec. 3(a)(2)(B) of the bill, amending CDA Section 230(e)

            by adding at the end the following:

“(5) NO EFFECT ON CIVIL LAW RELATING TO SEX TRAFFICKING.—Nothing in this section shall be construed to impair the enforcement or limit the application of section 1595 of title 18, United States Code.”.

(Emphasis.)

Here’s a convenient link for—

18 USC § 1595 – Civil remedy

(a) An individual who is a victim of a violation of this chapter may bring a civil action  . . . .

Ninja (profile) says:

Part of me says “let the chaos ensue”. The impact would be so huge that the backlash would have the rules reverted in record time after the initial flurry of lawsuits and most big companies relocating themselves to safer countries.

It would be awesome to see coordinated responses from tech companies laying off all their American workforces and moving abroad.

Gwiz (profile) says:

Re: Re:

Wow. Mike mentions abuse 8 times, but never once in noting that children are literally being abused.

 

And the abusers should be prosecuted to fullest extent of the law. No one has said otherwise.

But that’s not what this bill is about. If you look beyond the grandstanding of "save the children", this bill is about blaming the wrong parties.

It’s like trying to blame Ford for making windowless white vans or Nestle for making candy because they sometimes are used by pedophiles looking for victims.

Travis says:

Re: Re: Re:

Actually, under this law you could.

The term ‘participation in a venture’ means knowing conduct by an individual or entity, by any means, that assists, supports, or facilitates a violation…

If Ford knowingly builds a van that is then used by a pedo, they are liable. The definition does not say "with intent to"; it is the act of building the van that assists the pedo with abducting a kid that triggers the liability.

Any narrowing of these vague terms after the bill is signed into law would require multiple court cases over the course of several years. That will cost the government, companies, and average Joes millions to fix.

JoeCool (profile) says:

Pretty clear what the REAL goal is

It would likely lead to many platforms not being created at all, just because of the potential liability. For ones that already exist, some may shutter, and others may greatly curtail what the platform allows.

And there it is – the real goal: to shut down the disruptive services enabled by the internet. The old gatekeepers are behind this bill in a clear last-ditch effort to bring down the new industries putting them out of business.

Anonymous Coward says:

Re: Fire in a theatre

That is part of the problem

People that “think” they know shit, but don’t.

or… TD in general. We have the same problem that Congress does. Political bias that keeps us all stuck on stupid, because survival is greater if you are an attached, lying, corrupt bastard while death is far more imminent if you are an unattached honest person.

Anonymous Coward says:

DOOM! It's teh end of teh internets, I tells ya!

My, you’re hot on this.

"the fact that the bill is clearly counterproductive to its own stated goals." — 1) It’s not a "fact" because you say so, simply feeble propaganda there. 2) It’s false notion that removing privileges from liability for "internet corporations" and letting them be like all others is bad. Those privileges would not exist except that corporations simply BOUGHT "law" tailored for their gain.

"the only thing CDA 230 is good for is absolving platforms from doing any moderation at all." — Yup. You know it, that’s why you deny early. Corporations have made tons of money without responsibility. I hope that era is over.

Gwiz (profile) says:

Re: DOOM! It's teh end of teh internets, I tells ya!

It’s false notion that removing privileges from liability for "internet corporations" and letting them be like all others is bad.

No, your notion is the one that is false.
 

CDA 230 puts internet companies on equal footing with "all others" by not making the tool provider responsible for how the tool gets used.

We don’t make Ford responsible when someone speeds. We don’t make McDonalds responsible because people get fat. We don’t make Smith & Wesson responsible when someone gets shot. We don’t make Exxon responsible when an arsonist uses gasoline to start a fire. Etc, etc, ad nauseam.
 

You are the one who thinks that because it’s an "internet corporation" it should be treated differently than "all others".
