from the guys-it's-still-the-law dept
In the post the other day about Utah trying to ignore Section 230 so it could regulate internet platforms, I explained why it was important that Section 230 pre-empted these sorts of state efforts:
Just think about the impossibility of trying to simultaneously satisfy, in today’s political climate, what a Red State government might demand from an Internet platform and what a Blue State might. That readily foreseeable political catch-22 is exactly why Congress wrote Section 230 in such a way that no state government gets to demand appeasement when it comes to platform moderation practices.
We don’t have to strain our imaginations very hard, because with this lawsuit, by King County, Washington prosecutors against Google, we can see a Blue State do the same thing Utah is trying to do and come after a platform for how it handles user-generated content.
Superficially there are of course some differences between the two state efforts. Utah’s bill ostensibly targets social media posts whereas Washington’s law goes after political ads. What’s wrong with Washington’s law may also be a little more subtle than Utah’s abjectly unconstitutional attempt to trump internet services’ expressive and associative rights. But these are not meaningful distinctions. In both cases it boils down to the same thing: a state trying to force a platform to handle user-generated content (which online ads generally are) the way the state wants, by imposing requirements that will inevitably shape how platforms do so.
In the Washington case prosecutors are unhappy that Google is apparently not following well enough the prescriptive rules Washington State established to help the public follow the money behind political ads. One need not quibble with the merit of what Washington State is trying to do, which, at least at first glance, seems perfectly reasonable: make campaign finance more transparent to the public. Nor is it necessary to take issue with the specific rules the state came up with to vindicate this goal. The rules may or may not be good ones, but whether they are good is irrelevant. That there are rules at all is the problem, and one that Section 230 was purposefully designed to avoid.
As discussed in that other post, Congress went with an all-carrot, no-stick approach in regulating internet content, giving platforms the most leeway possible to do the best they could to help achieve what Congress wanted overall: the most beneficial and least harmful content online. But this approach falls apart once sticks get introduced, which is why Congress included pre-emption in Section 230 so that states couldn’t introduce them. Yet that’s what Washington is trying to do with its disclosure rules surrounding political ads: introduce sticks by imposing regulatory requirements that burden how platforms can facilitate user-generated content, in spite of Congress’s efforts to relieve them of these burdens.
The burden is hardly incidental or slight. Remember that if Washington could enforce its own rules, then so could any other state or locality, even when those rules were far more demanding, or ultimately compromised this or any other worthy policy goal, either inadvertently or even deliberately. Furthermore, even if every state had good rules, the differences between them would likely make compliance unfeasible for even the best-intentioned platform. Indeed, even by the state’s own admission, Google actually had policies aimed at helping the public learn who had sponsored the ads appearing on its services.
Per Google’s advertising policies, advertisers are required to complete advertiser identity verification. Advertisers seeking to place election advertisements through Google’s advertising networks are required to complete election advertisement verification. Google notifies all verified advertisers, including, but not limited to sponsors of election advertisements, that Google will make public certain information about advertisements placed through Google’s advertising networks. Google notifies verified sponsors of election advertisements that information concerning their advertisements will be made public through Google’s Political Advertising Transparency Report.
Google’s policy states:
With the information you provide during the verification process, Google will verify your identity and eligibility to run election ads. For election ads, Google will [g]enerate, when possible, an in-ad disclosure that identifies who paid for your election ad. This means your name, or the name of the organization you represent, will be displayed in the ad shown to users. [And it will p]ublish a publicly available Political Advertising transparency report and a political ads library with data on funding sources for election ads, the amounts being spent, and more.
Google notifies advertisers that in addition to the company’s online Political Advertising Transparency Report, affected election advertisements “are published as a public data set on Google Cloud BigQuery[,]” and that users “can export a subset of the ads or access them programmatically.” Google notifies advertisers that the downloadable election ad “dataset contains information on how much money is spent by verified advertisers on political advertising across Google Ad Services. In addition, insights on demographic targeting used in political advertisement campaigns by these advertisers are also provided. Finally, links to the actual political advertisement in the Google Transparency Report are provided.” Google states that public access to “Data for an election expires 7 years after the election.” [p. 14-15]
Yet Washington is still mad at Google anyway because it didn’t handle user-generated content exactly the way it demanded. And that’s a problem, because if it can sanction Google for not handling user-generated content exactly the way it wants, then (1) so could any other state or any of the infinite number of local jurisdictions Google inherently reaches, (2) to enforce an unlimited number of rules, and (3) governing any sort of user-generated content that may happen to catch a local regulator’s attention. Utah may today be fixated on social media content and Washington State on political ads, but once they’ve thrown off the pre-emptive shackles of Section 230 they or any other state, county, city or smaller jurisdiction could go after platforms hosting any of the myriad other sorts of expression people use internet services to facilitate.
Which would sabotage the internet Congress was trying to foster with Section 230. Again, Congress deliberately gave platforms a free hand to decide how best to moderate user content so that they could afford to do their best at keeping the most good content up and taking the most bad content down. But with all these jurisdictions threatening to sanction platforms, trying to do either of these things can no longer be platforms’ priority. Instead they will be forced to devote all their resources to the impossible task of trying to avoid a potentially infinite amount of liability. While perhaps at times this regulatory pressure might result in nudging platforms to make good choices for certain types of moderation decisions, it would be more out of coincidence than design. Trying to stay out of trouble is not the same thing as trying to do the best for the public, and often can turn out to be in direct conflict.
Which we can see from Washington’s law itself. In 2018 prosecutors attempted to enforce an earlier version of this law against Google, which led it to declare that it would refuse all political ads aimed at Washington voters.
Three days later, on June 7, 2018, Google announced that the company’s advertising networks would no longer accept political advertisements targeting state or local elections in Washington State. Google’s announced policy was not required by any Washington law and it was not requested by the State. [p. 7]
Prosecutors may have been surprised by Google’s decision, but no one should have been. Such a decision is an entirely foreseeable consequence, because if a law makes it legally unsafe for platforms to facilitate expression, then they won’t.
Even the complaint itself, albeit perhaps inadvertently, makes clear what a loss for discourse and democracy it is when expression is suppressed.
As an example of Washington political advertisements Google accepted or provided after June 4, 2018, Google accepted or provided political advertisements purchased by Strategies 300, Inc. on behalf of the group Moms for Seattle that ran in July 2019, intended to influence city council elections in Seattle. Google also accepted or provided political advertisements purchased by Strategies 300, Inc. on behalf of the Seattle fire fighters that ran in October 2019, intended to influence elections in Seattle. [p. 9]
While prosecutors may frame it as scurrilous that Google accepted ads “intended to influence elections,” influencing political opinion is at the very heart of why we have a First Amendment to protect speech in the first place. Democracy depends on discourse, and it is hardly surprising that people would want to communicate in ways designed to persuade on political matters.
Nor is the fact that they may pay for the opportunity to express it salient. Every internet service needs some way of keeping the lights on and servers running. That it may sometimes charge people to use its systems to convey their messages doesn’t alter the fact that it is still a service facilitating user-generated content, which Section 230 exists to protect and needs to protect.
Of course, even in the face of unjust sanction sometimes platforms may try to stick it out anyway, and it appears from the Washington complaint that Google may have started accepting ads again at some point after it had initially stopped. It also agreed to pay $217,000 to settle a 2018 enforcement effort, although, notably, without admitting to any wrongdoing, a crucial fact prosecutors omit from their current pleading.
On December 18, 2018, the King County Superior Court entered a stipulated judgment resolving Google’s alleged violations of RCW 42.17A.345 from 2013 through the date of the State’s June 4, 2018, Complaint filing. Under the terms of the stipulated judgment, Google agreed to pay the State $200,000.00 as a civil penalty and an additional $17,000.00 for the State’s reasonable attorneys’ fees, court costs, and costs of investigation. A true and correct copy of the State’s Stipulation and Judgment against Google entered by the King County Superior Court on December 18, 2018, is attached hereto as Exhibit B. [p. 8. See p. 2 of Exhibit B for Google expressly disclaiming any admission of liability.]
Such a settlement is hardly a confession. Google could have opted to settle rather than fight for any number of reasons. Even platforms as well-resourced as Google still need to choose their battles. Because it’s not just a question of being able to afford to hire all the lawyers you may need; you also need to be able to effectively manage them all, along with every skirmish on every front that becomes vulnerable if Section 230 no longer effectively preempts those attacks. Being able to afford a fight means being able to afford it in far more ways than just financially, and thus it is hardly unusual for those threatened with legal process to simply purchase relief from the onslaught instead of fighting for the just result.
Without Section 230, or its preemption provision, however, that’s what we’ll see a lot more of: unjust results. We’ll also see less effective moderation as platforms redirect their resources from doing better moderation to avoiding liability instead. And we’ll see what Google foreshadowed: platforms withdrawing their services from the public entirely as it becomes financially prohibitive to pay off all the local government entities that might like to come after them. It will not get us a better internet or more innovative online services, nor will it solve any of the problems any of these state regulatory efforts hope to fix. It will only make everything much, much worse.
Filed Under: king county, political ads, pre-emption, section 230, washington