Ninth Circuit Tells Online Services: Section 230 Isn't For You

from the practical-effect dept

Last year we wrote about Homeaway and Airbnb's challenge to an ordinance in Santa Monica that would force them to monitor their Santa Monica listings to ensure they were legally compliant. The Santa Monica ordinance, like an increasing number of ordinances around the country, requires landlords wanting to list their properties on these services to register with the city and meet various other requirements. That part of the ordinance is not what causes concern, however. It may or may not be good local policy, but Santa Monica officials attempting to hold their landlord users liable for going online to advertise a non-compliant rental listing in no way undermines Section 230's crucial statutory protection for platforms.

The problem with the ordinance is that it does not just impose liability on landlords. It also imposes liability on the platforms hosting their listings. The only way for the platforms to avoid that liability is to engage in the onerous, if not outright impossible, task of scrutinizing whether the listings on their platforms are legal. Which is exactly what Section 230 exists to prevent: forcing platforms to monitor their users' speech for legality, because platforms forced to police that speech would end up facilitating a lot less legitimate speech.

Yet that's what the Ninth Circuit decided to let Santa Monica do – force platforms to monitor their user-generated speech – in a decision earlier this week upholding the district court's refusal to enjoin the ordinance.

That's not how the court saw it, of course. To the court, platforms weren't being forced to police the speech they hosted. They were merely obligated to police the rental transactions they facilitated.

[T]he Ordinance does not require the Platforms to monitor third-party content and thus falls outside of the CDA’s immunity … [T]he only monitoring that appears necessary in order to comply with the Ordinance relates to incoming requests to complete a booking transaction—content that, while resulting from the third party listings, is distinct, internal, and nonpublic. [p. 13-14]

However, this is a distinction without a difference.

As we pointed out in the amicus brief the Copia Institute filed in support of Homeaway and Airbnb, these listings are indeed user-generated speech. It may be speech that's extremely limited in scope, little more than "I have housing to rent," but it is still user speech that, per the ordinance, may not always be legal to say. The problem is that the ordinance, in effect, passes liability on to platforms if they allow this speech to be illegally said, which is no different from trying to pass liability on to a platform for any other speech its users may illegally say.

Yet in its decision the court insisted that platform liability attaches to something entirely apart from its role as a platform facilitating user speech:

Similarly, here, the Ordinance is plainly a housing and rental regulation. The “inevitable effect of the [Ordinance] on its face” is to regulate nonexpressive conduct—namely, booking transactions—not speech. [p. 19-20]

It went on to declare that the ordinance in no way forces platforms to monitor user content:

Contrary to the Platforms’ claim, the Ordinance does not “require” that they monitor or screen [listings]. It instead leaves them to decide how best to comply with the prohibition on booking unlawful transactions. [p. 20]

At every step in its reasoning it kept treating the ordinance as something wholly apart from an ordinance impacting speech:

Nor can the Platforms rely on the Ordinance’s “stated purpose” to argue that it intends to regulate speech. The Ordinance itself makes clear that the City’s “central and significant goal . . . is preservation of its housing stock and preserving the quality and nature of residential neighborhoods.” As such, with respect to the Platforms, the only inevitable effect, and the stated purpose, of the Ordinance is to prohibit them from completing booking transactions for unlawful rentals. [p. 20]

But no amount of handwaving by the court to try to focus on the financial transaction between landlord and renter, or insistence that this ordinance doesn't force platforms to monitor user-generated speech, will change the basic reality that it does indeed force platforms to do exactly that: police user speech for legality in order to avoid liability arising from that speech. It is exactly the sort of situation Section 230 was intended to forestall because of the inevitable chilling effect fear-driven platform monitoring obligations have on online speech and innovation.

The court seemed to try to justify its contorted reasoning by noting that because "brick and mortar" businesses have to comply with all sorts of local regulations, Internet businesses also should have to.

We have consistently eschewed an expansive reading of the statute that would render unlawful conduct “magically . . . lawful when [conducted] online,” and therefore “giv[ing] online businesses an unfair advantage over their real-world counterparts.” For the same reasons, while we acknowledge the Platforms’ concerns about the difficulties of complying with numerous state and local regulations, the CDA does not provide internet companies with a one-size-fits-all body of law. Like their brick-and-mortar counterparts, internet companies must also comply with any number of local regulations concerning, for example, employment, tax, or zoning. [p. 16]

But this thinking fails to recognize the unique differences between brick and mortar businesses and Internet businesses, differences that help explain why it is so important to give Internet businesses this vital protection. After all, a brick and mortar store only has to comply with the laws of the jurisdiction where the store is located – as Internet platforms also must, in the finite number of places where they have a physical or corporate presence. But when it comes to its online presence, an Internet business is everywhere and thus theoretically exposed to the laws of every single jurisdiction, no matter how onerous those laws are, or how much they may conflict with one another.

While the Santa Monica ordinance in and of itself may not be too onerous for the platforms to comply with, Santa Monica is but one city, and the Ninth Circuit has now given the green light to every other city in every other state to come up with its own ordinance that will similarly force platforms to monitor user content. As Congress feared in 1996 when it passed Section 230, this decision now invites platforms to divert resources better spent elsewhere, overly censor user speech, withdraw from entire markets – even those that might prefer to have these services available – or risk being bankrupted by an infinite number of local jurisdictions pulling them in every possible direction.

This result is chilling not just to these platforms but to any other innovative service, especially if the service has any effect in the offline world, as so many do, or facilitates economic transactions between users, as so many also do. If bearing these indicia is enough to cost a platform its Section 230 protection, then few will be able to retain it.

Filed Under: 9th circuit, california, cda 230, housing, intermediary liability, internet, platforms, santa monica, section 230, speech
Companies: airbnb, homeaway


Reader Comments

    Anonymous Anonymous Coward (profile), 15 Mar 2019 @ 5:13pm

    Re: Re: Margarine melts at 110°, and butter at 98.6°, feel

    I think that platform liability does relate, and is comparable. They follow the laws put out for them, and those are Federal laws (assuming they are based in the US) because they are interstate rather than local laws.

    A platform checking the registration or title of a car for sale and verifying that the seller is actually the owner is very different from a platform reading a lease or sales agreement (assuming those are actually available) and determining its legality. The former is possible, with some limitations, such as relying on the seller telling the truth about their identity. The latter is a question that is not actually discernible by a platform without a court.

