If We're Going To Talk About Discrimination In Online Ads, We Need To Talk About Roommates.com
from the deja-vu-all-over-again dept
It has been strange to see people speak about Section 230 and illegal discrimination as if it were somehow a new issue. In fact, one of the seminal court cases that articulated the parameters of Section 230, the Roommates.com case, did so in the context of housing discrimination. It's worth taking a look at what happened in that litigation and how it bears on the current debate.
Roommates.com was (and apparently remains) a specialized platform that does what it says on the tin: allow people to advertise for roommates. Back when the lawsuit began, it allowed people who were posting for roommates to include racial preferences in their ads, and it did so in two ways: (1) through a text box, where people could write anything about the roommate situation they were looking for, and (2) through answers to mandatory questions about roommate preferences.
Roommates.com got sued by the Fair Housing Councils of the San Fernando Valley and San Diego for violating federal (FHA) and state (FEHA) fair housing law by allowing advertisers to express these discriminatory preferences. It pled a Section 230 defense, because the allegedly offending ads were user ads. But, in a notable Ninth Circuit decision, it both won and lost.
In sum, the court found that Section 230 indeed applied to the user expression supplied through the text box. That expression, for better or worse, was entirely created by the user. If something was wrong with it, it was the user who had made it wrongful and the user, as the information content provider, who could be held responsible, but not, per Section 230, the Roommates.com platform, which was the interactive computer service provider for purposes of the statute and therefore immune from liability for it.
But the mandatory questions were another story. The court was concerned that, if these ads were illegally discriminatory, the platform had been a party to the creation of that illegality by prompting the user to express discriminatory preferences. And so the court found that Section 230 did not provide the platform a defense to any claim predicated on the content elicited by these questions.
Even though it was a split and somewhat messy decision, the Roommates.com case has held up over the years and provided subsequent courts with some guidance for how to figure out when Section 230 should apply. There are still fights around the edges, but figuring out whether it should apply has basically boiled down to determining who imbued the content with its allegedly wrongful quality. If the platform, then it’s on the hook as much as the user may be. But its contribution to wrongful content’s creation still had to be more substantive than merely offering the user the opportunity to express something illegal.
The fact that Roommate encourages subscribers to provide something in response to the prompt is not enough to make it a “develop[er]” of the information under the common-sense interpretation of the term we adopt today. It is entirely consistent with Roommate’s business model to have subscribers disclose as much about themselves and their preferences as they are willing to provide. But Roommate does not tell subscribers what kind of information they should or must include as “Additional Comments,” and certainly does not encourage or enhance any discriminatory content created by users. Its simple, generic prompt does not make it a developer of the information posted. [p. 1174].
The reason it is so important to hold onto that distinction is that the Roommates.com litigation has a punchline. The case didn't end there, with that first Ninth Circuit decision. After several more years of litigation there was another Ninth Circuit decision in the case, this time on the merits of the discrimination claim.
And the claim failed. Per the Ninth Circuit, roommate situations are so intimate that the First Amendment right of free association must be allowed to prevail, and people must be able to choose whom they live with by any means they like, even if that means indulging xenophobic prejudice.
Because of a roommate’s unfettered access to the home, choosing a roommate implicates significant privacy and safety considerations. The home is the center of our private lives. Roommates note our comings and goings, observe whom we bring back at night, hear what songs we sing in the shower, see us in various stages of undress and learn intimate details most of us prefer to keep private. Roommates also have access to our physical belongings and to our person. As the Supreme Court recognized, “[w]e are at our most vulnerable when we are asleep because we cannot monitor our own safety or the security of our belongings.” Minnesota v. Olson, 495 U.S. 91, 99, 110 S.Ct. 1684, 109 L.Ed.2d 85 (1990). Taking on a roommate means giving him full access to the space where we are most vulnerable. [p. 1221]
Government regulation of an individual’s ability to pick a roommate thus intrudes into the home, which “is entitled to special protection as the center of the private lives of our people.” Minnesota v. Carter, 525 U.S. 83, 99, 119 S.Ct. 469, 142 L.Ed.2d 373 (1998) (Kennedy, J., concurring). “Liberty protects the person from unwarranted government intrusions into a dwelling or other private places. In our tradition the State is not omnipresent in the home.” Lawrence v. Texas, 539 U.S. 558, 562, 123 S.Ct. 2472, 156 L.Ed.2d 508 (2003). Holding that the FHA applies inside a home or apartment would allow the government to restrict our ability to choose roommates compatible with our lifestyles. This would be a serious invasion of privacy, autonomy and security. [id.].
Because precluding individuals from selecting roommates based on their sex, sexual orientation and familial status raises substantial constitutional concerns, we interpret the FHA and FEHA as not applying to the sharing of living units. Therefore, we hold that Roommate’s prompting, sorting and publishing of information to facilitate roommate selection is not forbidden by the FHA or FEHA. [p. 1223]
This ruling is important on a few fronts. In terms of substance, it means that any law that tries to ban discrimination may itself have constitutional problems. It may be just, proper, and even affirmatively constitutional to ban discrimination in many or even most contexts. But, as this decision explains, it isn't necessarily so in all contexts, and ignoring this nuance risks harm to people and to the liberty interests that protect them.
Meanwhile, from a Section 230 perspective, the decision meant that a platform got dragged through years of expensive litigation only to ultimately be exonerated. It's remarkable it even managed to survive, as many platforms needlessly put through the litigation grinder don't. And that's a big reason why we have Section 230: we want to make sure platforms can't get bled dry before being found not liable. It is not ultimate liability that crushes them; it's the litigation itself that can tear them to pieces and force them to shut down, or at least severely restrict even lawful content.
Section 230 is designed to avoid these outcomes, and it’s important that we not let our distaste, however justified, for some of the content internet users may create prompt us to make the platforms they use vulnerable to such ruin. Not if we want to make sure internet services can still remain available to facilitate the content that we prefer they carry instead.