Just Like Every Other Platform, Parler Will Take Down Content And Face Impossible Content Moderation Choices
from the this-is-not-the-panacea dept
Like Gab before it, Parler is the hot new Twitter-wannabe service for the assholes and trolls kicked off of Twitter. The President and a bunch of his supporters have hyped it up, and now Senator Ted Cruz (and Rep. Devin Nunes) have joined, pushing the same misleading claim as others before them: that Parler supports free speech in a way Twitter does not. Cruz — who has been spewing blatantly false information about “anti-conservative bias” on various internet platforms — even announced his move to Parler… on Twitter, which does not appear to be moderating him at all. Cruz’s overwrought announcement is full of the nonsense that has come to typify his pathetic attempts to win fans among Trump’s base.
But I did want to take a closer look at the claims that Parler supports free speech, because it does so in basically the same way as every other platform — including Twitter, YouTube, and Facebook: by saying that it can remove your content for any reason it wants. Its user agreement includes this:
Parler may remove any content and terminate your access to the Services at any time and for any reason or no reason, although Parler endeavors to allow all free speech that is lawful and does not infringe the legal rights of others. Any invitation made by Parler to you to use the Services or submit content to the Services, or the fact that Parler may receive a benefit from your use of the Services or provision of content to the Services, will not obligate Parler to maintain any content or maintain your access to the Services. Parler will have no liability to you for removing any content, for terminating your access to the Services, or for modifying or terminating the Services, at any time and in any way and for any reason or no reason. Although the Parler Guidelines provide guidance to you regarding content that is not appropriate, Parler is free to remove content and terminate your access to the Services even where the Guidelines have been followed.
Parler should be thankful that it has Section 230 of the CDA to make that possible. And it should probably be ticked off at Cruz, who has been among those threatening to revoke Section 230.
My favorite line is the last one, which says that it can remove content or terminate your account even where you have followed its Guidelines.
Under various proposals to reform Section 230, this would go against the law, but Parler is actually doing the right thing here. If you only limit your moderation powers to what is explicitly in your terms, then people will game those terms and cause problems on your platform. You need the flexibility to deal with bad actors — the flexibility that Section 230’s current structure provides.
And while Parler’s Community Guidelines are written in a manner that makes it look like they’re mimicking 1st Amendment jurisprudence, that’s a trick they’re playing, because the specifics do not match the reality. First, at the very top, they say that no spam is allowed:
Spam is repetitive content that does not contribute to the conversation. It often comes in the form of multiple posts of repeating content that offer little to no value to the community and platform at large.
And the guidelines tell users:
- Avoid repetition in the comment section of posts. Spam applies more heavily to comments then posts.
- Do not use language/visuals that are meant to take advantage of others on Parler.
- Avoid language/visuals that solicits advertisements on other’s posts.
Of course, all of that is 1st Amendment protected speech.
Parler also bans sharing “rumors about other users/people you know are false.” And while they couch this as being the same as defamation, the legal standard for defamation requires far more than that. Banning “rumors about other users/people you know are false” will require Parler to make judgment calls about what stays up and what comes down.
In the section meant to mimic the Supreme Court’s (mostly obsolete) “fighting words” doctrine, Parler again bars plenty of 1st Amendment protected speech from its platform:
Any direct and very personal insult with the intention of stirring and upsetting the recipient
Of course, intention is subjective, meaning that here, too, Parler would need to make judgment calls.
Parler, like Gab before it, bans pornography, falsely claiming:
Pornography is considered indecent according to clauses defined by the FCC.
The FCC polices broadcasts over the public airwaves, spectrum that is publicly owned but licensed to corporations. Its indecency determinations have no bearing on the internet (and raise some 1st Amendment issues of their own). Parler’s definition of porn is… really weird.
Printed text description, or visual material containing the explicit description or display of sexual organs or activity. Porn must meet ALL the following conditions:
- Porn does not require nudity
- Can be an image, painting, art, or description
- It must be morbid or degrading in nature (Prurient)
Of those “conditions,” only the last one is actually a condition. And if the others really are conditions that porn “must meet,” does that mean content that does include nudity is no longer porn?
There’s a lot more where this comes from, but almost all of it appears to be written by someone who did a Wikipedia search on exceptions to the 1st Amendment, but didn’t bother to talk to a 1st Amendment lawyer to understand what those exceptions actually meant.
Still, there is a larger point in all of this, which is something we’ve tried to explain to people over and over again. There is no such thing as not moderating content. First of all, some content moderation is required by law — especially things like child sexual abuse material and copyright infringement. Second, there are international issues that Parler will eventually need to deal with, even if it’s an American company. Already, some have pointed out how Parler’s user agreement might put users on the legal hook for international issues. Third, without content moderation, your site gets filled with junk, spam, and abuse. Even Parler seems to implicitly recognize this with its terms.
Plenty of sites have sprung up over the years promising no moderation, only to discover what that means in practice and conclude that some level of content moderation is a necessity. Now, Parler may take a more hands-off approach than others, and that’s great. Different approaches and different levels of experimentation should be encouraged. But the idea that Parler is somehow taking a substantially different approach than a site like Twitter is nonsense.
On a related note, Parler’s sudden burst of attention and usage should serve to highlight another nonsense talking point from the world of Trump: that the existing large platforms (namely: Facebook, Twitter, YouTube, Instagram) and their content moderation decisions are somehow a form of censorship or control of the “public square.” The great thing about the internet is that it’s still (mostly) open for other entrants to try to build a better mousetrap. So the idea that the other platforms need to be hit with regulations over their content moderation practices seems odd when Parler itself has demonstrated that it’s totally possible to build competitors.