Court Gets An Easy One Right: Section 230 Says Omegle Isn't To Blame For Bad People On Omegle
from the just-the-basics-here dept
Back in 2020, we had a post explaining that Section 230 isn’t why Omegle has awful content, and getting rid of Section 230 wouldn’t change that. Omegle, if you don’t know, is a service that randomly matches people into video chats. It’s basically the same thing as Chatroulette, which got super famous for a very brief period years ago. Both services are somewhat infamous for the unfortunately high likelihood of randomly ending up in a “chat” with some awful dude masturbating on the other side of the screen. But, still, there are a lot of people who like using it just for random chats. I have friends who are entertainers who like to use it to test out material on random people. It has a purpose. But, sure, there are some awful people on the site, as on many sites. And content moderation of live video chat is quite a challenge.
For reasons I don’t quite understand, some people blame Section 230 for the bad people on Omegle, and there have been a few recent lawsuits that try to get around Section 230 and still hold Omegle liable for the fact that bad people use the site. As others have explained in great detail, if these lawsuits succeed, they would do tremendous harm to online speech. We’ve discussed all the reasons why in the past — but pinning liability on an intermediary for speech of its users is the best way to stifle all sorts of important speech online.
So, it’s good news to see that one of the first such cases against Omegle was recently dismissed on Section 230 grounds — and rather easily at that (story first noted by Eric Goldman). The case involved a situation that is, quite clearly, terrible: an encounter with what’s apparently known as “a capper.” As explained in the ruling:
Omegle, like many websites, is susceptible to hacking…. According to Plaintiffs, sexual predators have taken advantage of the anonymity that Omegle offers to prey on other users, including children…. Among these predators are “cappers,” who trick children into committing sexual acts over live web feeds while simultaneously recording the encounters….
On March 31, 2020, C.H. was randomly placed in a chatroom with a capper during her first time on Omegle…. C.H. — an eleven-year-old girl at the time — accessed the Omegle platform from her laptop…. She was initially placed in a chatroom with other minors for some time…. C.H. later ended the chat with the minors and was placed in another chatroom…. She was met in the next chatroom with a black screen that began displaying text from the other anonymous user, “John Doe.” … John Doe informed C.H. that he knew where she lived, and he provided specific details of her whereabouts to prove it…. He threatened to hack C.H. and her family’s electronic devices if she did not disrobe and comply with his demands…. After pleading with John Doe without success, C.H. complied…. John Doe captured screenshots and recorded the encounter…. Immediately after this incident, C.H. informed her parents, who then contacted law enforcement.
Now there is no way to describe this as anything but absolutely horrifying. The dude who did this should be thrown away for a long time. But he is the person committing the horrible crime here, not Omegle. And that’s what Section 230 helps clarify. So here, the court dismissed the case against Omegle:
First, Omegle is an ICS provider under Section 230. That is, Omegle is a system that allows multiple users to connect to a computer server via the Internet. 47 U.S.C. § 230(f)(3). ICS providers are afforded immunity under the CDA unless they materially augment or develop the unlawful content at issue. See Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1167-68 (9th Cir. 2008) (“a website helps to develop unlawful content, and thus falls within the exception to section 230, if it contributes materially to the alleged illegality of the conduct.”). Indeed, Plaintiffs appear to acknowledge that Omegle is an ICS provider by arguing that “the rapidly evolving legal landscape . . . increasingly holds Internet Service Providers . . . liable for the harms they facilitate and oftentimes create.”…
Nonetheless, a review of the factual allegations confirms that Omegle functions by randomly pairing users in a chatroom and enabling them to communicate in real time. (Doc. # 75 at ¶¶ 33-34). There are no factual allegations suggesting that Omegle authors, publishes, or generates its own information to warrant classifying it as an ICP rather than an ICS provider. Compare Doe v. Mindgeek USA Inc., No. SACV 21-00338-CJC(ADSx), 2021 WL 4167504, at *9 (C.D. Cal. Sept. 9, 2021) (finding that website was an ICP where it actively created programs, curated playlists, and developed private messaging systems to facilitate trafficking of child pornography) with Mezey v. Twitter, Inc., No. 1:18-cv-21069-KMM, 2018 WL 5306769, at *1 (S.D. Fla. July 17, 2018) (granting Twitter CDA immunity where it merely displayed, organized, and hosted user content). Nor are there any factual allegations that Omegle materially contributes to the unlawfulness of the content at issue by developing or augmenting it. See Roommates.com, 521 F.3d at 1167-68. Omegle users are not required to provide or verify user information before being placed in a chatroom with another user. (Doc. # 75 at ¶¶ 37, 50-51). Further, some users, such as hackers and cappers, can circumvent Omegle’s anonymity using the data they themselves collect from other users during their encounters. (Id. at ¶ 38). The Court is persuaded that Omegle’s hosting capabilities for its users, coupled with its lack of material content generation, place it squarely within the definition of an ICS provider under 47 U.S.C. § 230(f)(2).
The plaintiffs tried a bunch of arguments to get around Section 230, and all of them fail. One key argument was that Omegle’s design of the platform somehow creates liability through “negligence,” but the court says that doesn’t work:
The other claims, Counts V, VI, and VII, confirm that Plaintiffs’ theories of liability against Omegle are rooted in the creation and maintenance of the platform. These claims recognize the distinction between Omegle as an ICS provider and the users, but nonetheless treat Omegle as the publisher responsible for the conduct at issue. Yahoo!, 570 F.3d at 1101-02. This is corroborated in no small part by Count VII, the “ratification/indemnification” claim, where Plaintiffs maintain that child sex trafficking was so pervasive on and known to Omegle that it should be vicariously liable for the damages caused by the cappers and similar criminals…. Through the negligence and public nuisance claims, Plaintiffs allege that Omegle knew or should have known about the dangers that the platform posed to minor children, and that Omegle failed to ensure that minor children did not fall prey to child predators that may use the website….
The CDA bars such claims as they seek to redirect liability onto Omegle for the ultimate actions of their users. See, e.g., Bauer v. Armslist, LLC, No. 20-cv-215-pp, 2021 WL 5416017, at **25-26 (E.D. Wis. Nov. 19, 2021) (dismissing, among others, negligence, public nuisance, aiding and abetting tortious conduct, and civil conspiracy claims, against ICS provider website that was used to facilitate unlawful firearm sales); Kik, 482 F. Supp. 3d at 1249-50 (website where users solicited plaintiff for sexual photographs was immune from sex trafficking, negligence, and strict liability claims where website only enabled user communication); Poole v. Tumblr, Inc., 404 F. Supp. 3d 637, 642-43 (D. Conn. 2019) (content hosting website entitled to immunity from invasion of privacy and negligent infliction of emotional distress claims); Saponaro v. Grindr, LLC, 93 F. Supp. 3d 319, 325 (D. N.J. 2015) (dismissing “failure to police” claim against ICS provider under Section 230). Regardless of form, each of Plaintiffs’ claims ultimately seeks to treat Omegle as a publisher or speaker, which are encompassed within Section 230 immunity.
As the court notes, the person who did the wrong thing here was “John Doe,” not Omegle:
John Doe’s video feed, his brandishing of C.H.’s personal identifying information, and the threats he subjected her to were not provided by Omegle in any sense…. Merely providing the forum where harmful conduct took place cannot otherwise serve to impose liability onto Omegle.
There was, of course, also a FOSTA claim in the lawsuit. As you’ll recall, FOSTA created a new Section 230 exemption for sex trafficking. But, even with that, Omegle is not liable here, as the court notes that a site would need specific knowledge of sex trafficking, not “generalized knowledge” that the platform is sometimes used for sex trafficking.
As analyzed in the recent decision of Doe v. Kik Interactive, Inc., the legislative history of the CDA confirms that generalized knowledge that sex trafficking occurs on a website is insufficient to maintain a plausible 18 U.S.C. § 1591 claim that survives CDA immunity. 482 F. Supp. 3d 1242, 1250 n. 6 (S.D. Fla. 2020). The plaintiff in Kik alleged that multiple users on the Kik website solicited her for sexually explicit photographs. Id. at 1244. She then brought claims against Kik for violations of 18 U.S.C. §§ 1591, 1595, negligence, and strict liability. Id. at 1245-46, 1251. The Kik court found that Kik would not be immune from suit only if it were alleged that Kik had actual knowledge of the underlying incident and had some degree of active participation in the alleged sex trafficking venture. Id. at 1250-51. The Kik plaintiff did not assert actual knowledge or overt participation on behalf of Kik, and instead asserted that Kik had general knowledge of other sex trafficking incidents on the website. Id. at 1251. Thus, the Kik court found that Kik was entitled to Section 230 immunity because plaintiff had not plausibly alleged a claim that would surmount Section 230 immunity. Id.; see also Reddit, 2021 WL 5860904, at *8 (dismissing 18 U.S.C. § 1591 claim for failure to plead that ICS provider knowingly participated in a sex trafficking venture).
The requirement for actual knowledge, as opposed to generalized knowledge, seems to annoy some people, but it’s the only reasonable standard. If generalized knowledge were enough to create liability, how would a site respond? It would shut down all sorts of speech, overblocking for fear of any liability. Expecting a website to magically figure out how to stop bad people from using it is an impossible task. And it distracts from the simple fact that you should hold the people who committed the criminal acts liable for those acts, not the providers of the tools they use.
Other such cases should face a similar end. I understand that people are upset that there are bad people on these platforms doing bad things — and that some kids use these platforms. But there are better ways to deal with that: namely (1) holding those people who actually violate the law responsible for their own criminal acts, and (2) better educating our children on how to use the internet and what to do if they come across a dangerous situation like this.