PACT Act Is Back: Bipartisan Section 230 'Reform' Bill Remains Mistargeted And Destructive
from the second-verse,-same-as-the-first dept
Last summer we wrote about the PACT Act from Senators Brian Schatz and John Thune — one of the rare bipartisan attempts to reform Section 230. As I noted then, unlike most other 230 reform bills, this one seemed to at least come with good intentions, though it was horribly confused about almost everything in actual execution. If you want to read a truly comprehensive takedown of the many, many problems with the PACT Act, Prof. Eric Goldman’s analysis is pretty devastating and basically explains how the drafters of the bill tried to cram in a bunch of totally unrelated things, and did so in an incredibly sloppy fashion. As Goldman concludes:
This bill contains a lot of different policy ideas. It adds multiple disclosure obligations, regulates several aspects of sites' editorial processes, makes three different changes to Section 230, and asks for two different studies. Any one of these policy ideas, standing alone, might be a significant policy change. But rather than proposing a narrow and targeted solution to a well-identified problem, the drafters packaged this jumble of ideas together to create a broad and wide-ranging omnibus reform proposal. The spray-and-pray approach to policymaking betrays the drafters' lack of confidence that they know how to achieve their goals.
Daphne Keller also has a pretty thorough explanation of problems in the original — noting that the bill contains some ideas that seem reasonable, but often seems sorely lacking in important details or recognition of the complexity involved.
And, to their credit, staffers working on the bill did seem to take these and other criticisms at least somewhat seriously. They reached out to many of the critics of the PACT Act (including me) to have fairly detailed conversations about the bill, its problems, and other potential approaches. Unfortunately, the new version released today does not suggest that they took many of those criticisms to heart. Instead, they took the same basic structure of the bill and just played around at the margins, leaving the new bill a problematic mess, though a slightly less problematic mess than last year's version.
The bill still suffers from the same point that Goldman made originally. It throws a bunch of big (somewhat random) ideas into one bill, with no clear explanation of what problem it’s actually trying to solve. So it solves for things that are not problems, and calls other things problems that are not clearly problems, while creating new problems where none previously existed. That’s disappointing to say the least.
And writing an “acceptable use” policy that “reasonably informs users about the types of content that are allowed on the interactive computer service” is a fool’s errand. Because what is and what is not acceptable depends on many, many variables, including context. Just by way of example, many websites famously felt differently about having Donald Trump on their platform before and after the January 6th insurrection at the Capitol. Do we all need to write into our AUPs that such-and-such only applies if you don’t encourage insurrection? As we’ve pointed out a million times, content policy involves constant changes to your policies as new edge cases arise.
People who have never done any content moderation seem to assume that most cases are obvious and maybe you have a small percentage of edge cases. But the reality is often the opposite. Nearly every case is an edge case, and every case involves different context or different facts, and no “Acceptable Use Policy” can possibly cover that — which is why big companies are changing their policies all the time. And for smaller sites? How the fuck am I supposed to create an Acceptable Use Policy for Techdirt? We’re quite open with our comments, but we block spam, and we have our comment voting system — so part of our Acceptable Use Policy is “don’t write stuff that makes our users think you’re an asshole.” Is that what Schatz and Thune want?
The bill then also requires this convoluted notice-takedown-appeal process for content that violates our AUP. But how the hell are we supposed to do that when most of the moderation takes place by user voting? Honestly, we’re not even set up to “put back” content if it has been voted trollish by our community. We’d have to re-architect our comments. And, the only people who are likely to complain… are the trolls. This would enable trolls to keep us super busy having to respond to their nonsense complaints. The bill, like its original version, requires “live” phone-in support for these complaints unless you’re a “small business” or an “individual provider.” But, the terms say that you’re a small business if you “received fewer than 1,000,000 unique monthly visitors” and that’s “during the most recent 12-month period.” How do they define “unique visitors”? The bill does not say, and that’s just ridiculous, as there is no widely accepted definition of a unique monthly visitor, and every tracking system I’ve seen counts it differently. Also, does this mean that if you receive over 1 million visitors once in a 12-month period you no longer qualify?
Either way, under this definition, it might mean that Techdirt no longer qualifies as a small business, and there’s no fucking way we can afford to staff up a live call center to deal with trolls whining that the community voted down their trollish comments.
This bill basically empowers trolls to harass companies, including ours. Why the hell would Senator Schatz want to do that?!?
The bill also requires transparency reports from companies regarding the moderation they do, though it says they only have to come out twice a year instead of four times. As we've explained, transparency is good, and transparency reports are good — but mandated transparency reports are a huge problem.
For both of these, it’s unclear what exactly is the problem that Schatz and Thune think they’re solving. The larger platforms — the ones that everyone talks about — basically do all of this already. So it won’t change anything for them. All it will do is harm smaller companies, like ours, by putting a massive compliance burden on us, accomplishing nothing but… helping trolls annoy us.
The next big part of the bill involves “illegal content.” Again, it’s not at all clear what problem this is solving. The issue that the drafters of the bill would likely highlight is that some argue that there’s a “loophole” in Section 230: if something is judged to be violating a law, Section 230 still allows a website to keep that content up. That seems like a problem… but only if you ignore the fact that nearly every website will take down such content. The “fix” here seems only designed to deal with the absolute worst actors — almost all of which have already been shut down on other grounds. So what problem is this actually solving? How many websites are there that won’t take down content upon receiving a court ruling on its illegality?
Also, as we’ve noted, we’ve already seen many, many examples of people faking court orders or filing fake defamation lawsuits against “John Does” who magically show up the next day to “settle” in order to get a court ruling that the content violated the law. Enabling more such activity is not a good idea. The PACT Act tries to handwave this away by giving the companies 4 days (in the original version it was 24 hours) to investigate and determine if they have “concerns about the legitimacy of the notice.” But, again, that fails to take reality into account. Courts have no realistic time limit on adjudicating legality, but websites will have to review every such complaint in 4 days?!
The bill also expands the exemptions for Section 230. Currently, federal criminal law is exempt, but the bill will expand that to federal civil law as well. This is to deal with complaints from government agencies like the FTC and HUD and others who worried that they couldn't take civil action against websites due to Section 230 (though, for the most part, the courts have held that 230 is not a barrier in those cases). But, much more problematic is that it extends the exemption for federal law to state Attorneys General, allowing them to enforce those laws if their states have comparable laws. That is a potentially massive change.
State AGs have long whined about how Section 230 blocks them from suing sites — but there are really good reasons for this. First of all, state AGs have an unfortunate history of abusing their position to basically shake down companies that haven’t broken any actual law, but where they can frame them as doing something nefarious… just to get headlines that help them seek higher office. Giving them more power to do this is immensely problematic — especially when you have industry lobbyists who have capitalized on the willingness of state AGs to act this way, and used it as a method for hobbling competitors. It’s not at all clear why we should give state AGs more power over random internet companies, when their existing track record on these issues is so bad.
Anyway, there is still much more in the bill that is problematic, but on the whole this bill repeats all of the mistakes of the first — even though I know that the drafters know that these demands are unrealistic. The first time may have been due to ignorance, but this time? It’s hard to take Schatz and Thune seriously on this bill when it appears that they simply don’t care how destructive it is.