Rep. Louie Gohmert Wants To Strip Section 230 Immunity From Social Media Platforms That Aren't 'Neutral'

from the making-everything-'fair'-by-making-it-suck-for-everyone dept

Rep. Louie Gohmert is one of the most technologically inept Congressmen we have the misfortune of being “served” by. Getting to the top of this list isn’t easy. The halls of Congress are filled with people who truly don’t understand the tech they’re attempting to regulate. Nor do they appear to be making any effort to educate themselves. Gohmert, however, seems to believe his position as an elected official gives him tech smarts he actually doesn’t have, so he spends a great deal of time embarrassing himself when grilling tech reps during Congressional hearings.

Gohmert was one of the participants in the Social Media Bloodsport Hearings of 2018. Held over the course of several months, the hearings were 75% grandstanding and 20% misunderstanding the issues at hand. Social media services have been hit hard recently for appearing to bury/deplatform right-wing accounts while simultaneously allowing the platforms to be overrun with foreign state-operated bots. It’s ugly but the ignorance displayed by Gohmert and others during the hearings was just as galling.

It was at these hearings that a new myth about internet platform immunity came into being. Somehow, these lawmakers looked at Section 230 of the CDA and decided it required platforms to be “neutral” to avail themselves of this protection. A Senate hearing in April featured Sen. Ted Cruz demanding to know if Facebook considered itself a “neutral public forum.” Mark Zuckerberg said he’d look into it, claiming he wasn’t familiar with the “specifics” of the “law [Cruz] was speaking to.”

Bad answer. And the bad answer made Cruz look like he’d just played a successful round of “Stump the Tech Magnate.” But he had done nothing more than state something not backed by actual law. That should have been the end of it, but people who really wanted to believe Section 230 immunity requires “neutral” moderation used Cruz’s ignorance as the starting point for stupid lawsuits almost certainly destined for quick dismissals.

It’s one thing for the public to make bad assumptions about federal laws. It’s quite another when federal lawmakers do it. Rep. Gohmert, playing to the home crowd [read the replies], has declared he’s going to strip immunity from service providers who “use algorithms to hide, promote, or filter user content.”

That would be all service providers. Gohmert wants to strip immunity from all platforms solely because he believes in Ted Cruz’s ignorant fiction. The bill hasn’t been written yet, but the statement issued by Gohmert explains the basis for this incredibly idiotic legislative proposal:

Social media companies like Facebook, Twitter, and Google are now among the largest and most powerful companies in the world. More and more people are turning to a social media platform for news than ever before, arguably making these companies more powerful than traditional media outlets. Yet, social media companies enjoy special legal protections under Section 230 of the Communications Act of 1934, protections not shared by other media. Instead of acting like the neutral platforms they claim to be in order to obtain their immunity, these companies have turned Section 230 into a license to potentially defraud and defame with impunity.

Section 230 does not require neutrality. It never has. It does not forbid content moderation. It actually encourages good faith efforts to keep platforms free of content they don’t want. Twitter and Facebook could remove every right-leaning account on their platforms without losing Section 230 immunity — which solely shields them from being held liable for content posted by third parties. It does not insulate them from charges of fraud or defamation if, in fact, either of these were committed by the companies, rather than their users.

For Gohmert’s proposal to work, he would either need to add the missing “neutrality” component or do away with Section 230 immunity altogether. Both of these are terrible ideas. Neutrality would be impossible to define, much less enforce. And the removal of immunity would mean the end of social media platforms as we know them, as companies will not be willing to risk being sued for content created by their users.

Gohmert’s disingenuous idiocy doesn’t end there.

In one hearing, one of the internet social media executives indicated a desire to be treated like Fox News. Fox News does not have their immunity and this bill will fulfill that unwitting request. Since there still appears to be no sincere effort to stop this disconcerting behavior, it is time for social media companies to be liable for any biased and unethical impropriety of their employees as any other media company. If these companies want to continue to act like a biased medium and publish their own agendas to the detriment of others, they need to be held accountable.

The difference between Fox News and Twitter is Fox News creates the content it publishes. Twitter does not. That’s why Twitter has immunity and Fox News doesn’t. Maybe some tech exec said something stupid during a stupid hearing filled with chest-beating and misconceptions, but that doesn’t make Gohmert’s proposal any less moronic.

Make no mistake: the same people agitating for “neutral public forums” are the people who will be deplatformed first if Section 230 immunity is removed. It’s already happening while the immunity remains in place. Anyone trafficking in controversy will be shown the door before they can do any damage to the platforms that used to host them. If you want more blanket moderation and faster banning, by all means, gripe about immunity and neutrality. If you actually value the free flow of speech, keep dimwits like Gohmert out of office.



Comments on “Rep. Louie Gohmert Wants To Strip Section 230 Immunity From Social Media Platforms That Aren't 'Neutral'”

That One Guy (profile) says:

Re: Re: 'We can survive a few lawsuits, can YOU?'

More than that, you could argue that 230 protects those smaller individuals/groups more than it does ‘big internet companies’, as holding the platform liable stands to do a lot more damage to those that don’t have massive amounts of resources for excessive filtration and/or the inevitable lawsuits.

A platform like YT can survive several lawsuits by some idiot going after them rather than the one who posted the content being sued over, but for a smaller platform even one lawsuit could very well drive them under and cause the platform to shut down as not worth the risk.

Much like the stupidity in the EU over the link tax and mandatory filters (despite lies to the contrary), efforts to undermine 230 stand to help large companies more than they stand to hurt them, with the small fry (individual blogs, up-and-coming platforms that might provide competition to the bigger players, and so on) being the ones who stand to lose the most.

Anonymous Coward says:

Re: Re: Re:

A doctor who receives a blackmail letter from a Russian who says “Pay me 410,000 or I ruin your reputation online” would be harmed.

Section 230 is also at the root of false advertising, hate marketing, cyberbullying, etc. People have won large judgments in Australia and the UK over what appears in search results.

Section 230 literally immunizes those who inflict the harm of defamation (search engines), to the point where even if one “sues the original publisher” the engine still doesn’t have to remove what was posted.

This is unique to the US. It is not the law globally, for a reason. One judge in NY wondered why he couldn’t sue eBay when someone put him up for sale, btw.

Anonymous Coward says:

Re: Re: Re: Re:

The doctor in the hypothetical scenario would be potentially harmed … by the Russian. So, sure, let’s sue the various platforms because of what the Russian did, but let the Russian get off scot-free.

The next two lines? Please present evidence in support of your assertions. I have no reason to believe what you have said, since they are simply assertions and nothing else.

Final line:
That judge should realize that it’s because eBay did not put him up for sale. He can try and sue the guy who posted it. But eBay didn’t post it. One judge’s woeful lack of understanding is… one judge’s woeful lack of understanding.

Remember, 230 says that if some dude walks into the park you own and commits a crime, you can’t be sued just because you own the park.

John Roddy (profile) says:

Re: Re: Re: The amount of wrong…

Section 230 is also at the root of false advertising, hate marketing, cyberbullying, etc.

No, no it is not. That would be caused by shitty people being shitty people. Specifically, shitty people who are absolutely liable for their own shitty actions. Section 230 does nothing to stop anyone from going after them.

People have won large judgments in Australia and the UK over what appears in search results.

Which makes no sense at all. Why on earth would the platform need to be liable for that?

Section 230 literally immunizes those who inflict the harm of defamation (search engines), to the point where even if one "sues the original publisher" the engine still doesn’t have to remove what was posted.

Section 230 does not immunize anyone who defames. Search engines are also not the ones doing the defamation in whatever example you’re thinking of. That would be like suing the company that made the megaphone when someone yells a lie through it.

Oh, and as an added bonus, your note about the engine not having to remove the offending content is flat out wrong. This has happened many times before. They can refuse to remove it in some cases involving default judgements, but that’s an exception.

This is unique to the US. It is not the law globally, for a reason. One judge in NY wondered why he couldn’t sue eBay when someone put him up for sale, btw.

I believe it was also a judge in NY who posited that prior restraint was the answer to all problems of "Internet" as well. People say stupid things, and judges are no exception. Neither are you.

Mike Masnick (profile) says:

Re: Re: Re:2 The amount of wrong…

Oh, and as an added bonus, your note about the engine not having to remove the offending content is flat out wrong. This has happened many times before. They can refuse to remove it in some cases involving default judgements, but that’s an exception.

Just to clarify here, while I disagree with the larger argument the comment you’re responding to is making, he is correct that under 230 a search engine is not legally required to remove content, even after it’s been determined by a court to be defamatory. What that commenter conveniently ignores is that nearly every search engine WILL remove such content upon receipt of a legitimate court ruling to that effect (meaning that the "harm" the commenter describes is mostly mythical).

Indeed, the fact that most sites will remove such content when presented with a court ruling to that effect has created a big business in faking court orders (or worse, using fake "defendants" who quickly "settle," allowing a misleading "agreement" to be sent to the search engine in an effort to delete access to content someone dislikes).

John Roddy (profile) says:

Re: Re: Re:3 The amount of wrong…

Ah yes, that’s right. “Defamatory” content often does get “taken down” after a proper court order, but not necessarily because the law requires it. If they refuse, they won’t lose their overall immunity, after all. Once again, it’s a point that has nothing to do with Section 230 in the first place.

PaulT (profile) says:

Re: Re: Re: Re:

“A doctor who receives a blackmail letter from a Russian who says “Pay me 410,000 or I ruin your reputation online” would be harmed.”

He would be harmed by the Russian, not by anything related to section 230. Section 230 is not involved, either in the threat, or in the doctor’s options in going after the people responsible for the blackmail and any harm caused.

All that section 230 means is that the doctor can’t sue the services used by the Russian because they’re easier to find than the Russian himself, just as they’re not allowed to sue the post office for delivering the initial letter. The Russian is still culpable, the doctor would just have to go after him for reparations rather than the nearest related innocent 3rd party.

As ever, you’re lying about the very root of the issues you are opposing. All you are rooting for is that more innocent parties be held liable for things they did not do.

James Burkhardt (profile) says:

Re: Re:

Section 230 protects more than big business. No one is harmed by section 230 excepting that they can’t sue a big company for something a third party said.

Just like a newspaper couldn’t be sued for accurately reporting what a third party said (we had that article last week), we don’t hold Facebook responsible for accurately displaying the speech of its users.

If a statement does cause a harm that creates legal liability, the harmed party can still go after the party that caused the harm. But [insert big business here] did not cause the harm.

Christenson says:

Re: Section 230

+1 Troll Point. You are Rep Gohmert’s alter ego!

Look, many of us may feel that by their very size, the big internet platforms (facebook, etc) should be required to act more like common carriers than a small website like Techdirt.

We may even think that little guys are more important than big companies.

But that is not section 230. Section 230 came about because on Compuserve, some evil individual posted some bad stock advice, and people believed that person. When the financially injured individuals sued, they also sued Compuserve, and Congress agreed that wasn’t right, because without some legal protection, Compuserve would not allow much of any content.

Section 230 gives the content host the legal right, but not an obligation, to moderate whatever you or I may post, in any way the host sees fit, without incurring legal liability as if the host had written the material. For example, Daily Stormer and Techdirt are free to kick me off for no reason at all.

My only recourse for such arbitrary behavior is to find another platform. Now, when we talk google or facebook or twitter or some other huge platform, there might not be another platform, because they are so large, and I think we need much better anti-monopoly efforts. But the law does not say this; the large platforms are only fairly neutral because of the commercial incentives.

And before you go saying the law should require neutrality, consider that Techdirt has done a good job of demonstrating that moderation at scale is an unsolved and likely unsolvable technical problem.

cpt kangarooski says:

Re: Re: Section 230

But that is not section 230. Section 230 came about because on Compuserve, some evil individual posted some bad stock advice, and people believed that person. When the financially injured individuals sued, they also sued Compuserve

Actually it was because a good person posted good stock advice.

You’re confusing the two major cases on this that predate the passage of the CDA. Cubby v. Compuserve (in which people using Compuserve disparaged a startup media business) came out in favor of the service provider; Compuserve was not liable. It’s the Stratton Oakmont v. Prodigy case that dealt with stock advice. And the advice was good, which was that Stratton Oakmont was an untrustworthy firm. Turns out, they were untrustworthy — they’re the trading firm in Wolf of Wall Street, if you saw that movie. That case held the other way, that Prodigy, their service provider, was liable for statements of their users because they could and did moderate their boards.

It’s one of the delightful ironies of the whole thing.

Anonymous Coward says:

Re: Re:

“Anyone who defends Section 230”
I do not own or run a “big internet company”.

My company’s annual profit is barely $50k.

It would be 4x this amount if it wasn’t for bogus DMCA take down notices we receive simply because some fucking asshole blankets companies using KEYWORDS. Not actual content. FUCKING KEYWORDS.

If the alternative is an ICE takedown due to finger pointing accusations of “infringement” (based on KEYWORDS), then I’ll defend Section 230.

Also, fuck you for thinking this law only applies to “big internet companies”.

Seriously, FUCK. YOU.

James Burkhardt (profile) says:

Re: Re: Re: Re:

My guess, he is making a somewhat hyperbolic value statement based on the amount of downtime he faces due to DMCA notices impacting the uptime of his website, combined with revenues earned when DMCA notices are not affecting his revenue potential.

The implication that these takedowns are not shutting down his business, but imposing downtime, highlights a lack of validity in the claims, implying abuse of those takedown notices.

He is not claiming entitlement to that profit, but rather that he is suffering downtime due to the abusive behavior of others misusing the DMCA, and that based on current revenue trends, the downtime is significantly impacting his revenues.

His number (4 times current revenue) might not be accurate, but significant downtime on a website that would be the main avenue for the advertisement and sale of a product or service would almost certainly impact revenue.

Nor does any of that being false undermine his claims that SEC 230 defends his small business. Even if he couldn’t earn another penny, Sec 230 protects him as much as anyone else.

cpt kangarooski says:

Re: Re: Re: Re:

He is most likely saying that complying with baseless DMCA notices is a substantial cost and that but for that cost, if existing revenues held steady, he would be able to quadruple his profits without actually having to get more clients, do more work, make capital investments, hire more staff, etc. So while it’s still speculative, it’s a lot better grounded than idiots who think that if only piracy were impossible, everyone would line up to buy legitimate copies of things.

Anonymous Coward says:

Re: Re: Re:3 Re:

It’s a crappy pseudonym using a sad pun, plus an obligatory dig at the site because “muh content creators”.

blue isn’t exactly subtle when it comes to things he disagrees with even if it technically meshes with his worldview. He doesn’t care a jot for creators, just the corporations that buy them over.

That Anonymous Coward (profile) says:

“Introduced a bill today that would remove liability protections for social media companies that use algorithms to hide, promote, or filter user content.”

So ContentID would open YouTube up to liability?

It is so wonderful to see them pandering to the base who believe the lies. I wonder if we should have a law that says every time a Congresscritter lies they should be ejected from office. I have to think the sheer number of lies they tell dwarfs the actions of media platforms who have every right to run their business any fscking way they want.

We are the home of the brave, land of the free… unless we think there is a conspiracy, then we can throw out all of the rights so we can force you to behave how we want!! They are just as bad or worse than the SJWs who demand changes & once the camel’s nose is in the tent the entire herd comes in. (See also: Ban R. Kelly k thks, now here is a list of other artists we feel shouldn’t be on the platform!!!)

James Burkhardt (profile) says:

Re: Re: Re: Re:

Well, the other option was to not go above and beyond. I wouldn’t be upset about Content ID being declared illegal, given its efforts to appropriate the copyright of creators and assign them to other users.

The issue being that the current proposal also makes illegal all of search, any ability to filter an information feed, almost any ability for me to find content I am looking for. Because you need to use algorithms to do it.

PaulT (profile) says:

Re: Re: Re:2 Re:

“Well, the other option was to not go above and beyond”

You’re still blaming the wrong people if you’re blaming YouTube. The reason they went “above and beyond” is because they were being sued for all sorts of random shit – even content the labels had uploaded themselves – and the climate at the time was that the major content groups were too scared to offer legal channels. It was necessary to show some proactive filtering to avoid a situation where a court found them liable for encouraging infringement. If that had happened, no service like YouTube would have been legally allowed. Forget the occasional mistakes you’re talking about – how about if the independent creators didn’t have a platform at all?

ContentID is a pain in the arse for many reasons, but leave the blame for it where it belongs. Watching Google suffer will do absolutely nothing to address the problems that created ContentID, and may in fact encourage far worse ways of dealing with problems. You’re attacking YouTube because they occasionally make mistakes while filtering millions of videos an hour, but not attacking those who require the filters to be in place – and against YouTube’s original wishes.

Mason Wheeler (profile) says:

Re: Re: Re:3 Re:

It was necessary to show some proactive filtering to avoid a situation where a court found them liable for encouraging infringement.

No it wasn’t. What was necessary was to push back and say "this is what the law requires, this is all the law requires, and screw you if you want more than that." Instead, they tried to appease the extortionists, and the result was… well… exactly what you’d expect from someone who tries to appease extortionists. And we’re all worse off for it.

PaulT (profile) says:

Re: Re: Re:6 Re:

You forget that at the time, it was YouTube being accused of being the extorting party. If they didn’t do something, they faced a death of a thousand cuts as they were hammered with lawsuits, any one of which would have killed them if a judge ruled the wrong way.

Again, ContentID is crap, but to blame YouTube and only YouTube is not only blaming the victim, it’s utterly ignorant of what was happening at the time.

PaulT (profile) says:

Re: Re: Re:8 Re:

“Where did I say I’m blaming only YouTube?”

Did you attack anyone else in your rant? I must have missed that.

“Yes, they didn’t create the problem, but they absolutely did make the problem worse”

They really didn’t if you actually understood the other options that were on the table at the time. Would you prefer YouTube to have been shut down, and thus everyone else who didn’t have Google’s weight behind them?

Mason Wheeler (profile) says:

Re: Re: Re:9 Re:

Did you attack anyone else in your rant? I must have missed that.

You apparently missed the meaning of the word “extortion” if you think I didn’t blame anyone else for what happened.

Once again, typing this nice and slow so you’ll comprehend it: simply because one party did something wrong doesn’t mean that someone else responding to it couldn’t have also done something wrong that made the whole situation worse.

> They really didn’t if you actually understood the other options that were on the table at the time. Would you prefer YouTube to have been shut down, and thus everyone else who didn’t have Google’s weight behind them?

Mason Wheeler (profile) says:

Re: Re: Re:5 Re:

Reacting to extortion in the wrong way is something blameworthy, because it legitimizes the extortion and empowers it. YouTube refusing to stand up for their rights–for our rights–to flip the script and take the fight to the enemy, and choosing to appease them and collude with them instead, is directly responsible for a large part of the mess we find ourselves in today.

There’s a reason why phrases like "millions for defense but not one cent for tribute" and "we do not negotiate with terrorists" are a thing. Because when you go outside of that mindset, this is what happens.

Anonymous Coward says:

Re: Re: Re:6 Re:

YouTube refusing to stand up for their rights–for our rights–to flip the script and take the fight to the enemy, and choosing to appease them and collude with them instead, is directly responsible for a large part of the mess we find ourselves in today.

It is also part of the reason that YouTube still exists, as the people attacking them are experts at using the legal system to bankrupt people who do not give in to their demands. When the legal system does not allow you to recover your costs, you can easily go bankrupt while winning your case, and YouTube could have won the case and gone out of business because of the cost of winning.

PaulT (profile) says:

Re: Re: Re:6 Re:

“Reacting to extortion in the wrong way is something blameworthy”

But, not worthy of the majority of blame in the situation, and certainly not something that needs you gloating like an idiot over the entire industry suffering because you’d prefer them to have done something different.

Stop being a prick, and place the blame where it belongs.

“There’s a reason why phrases like “millions for defense but not one cent for tribute” and “we do not negotiate with terrorists” are a thing. “

YouTube had 2 choices at the time – play the game as it was being presented, or not only have themselves shut down but have a legal precedent on the books to ban user created content online completely.

You’re pretty pathetic if you think they should have chosen the second option.

Mason Wheeler (profile) says:

Re: Re: Re:7 Re:

False dichotomy, and if you really think that some movie studio had a big enough war chest to do to Google, of all companies, what was done to Veoh, you have no room to be throwing around words like "pathetic."

What Google could have done to resolve this, if they actually had the spine to, was to take one look at the lawsuit and say "let’s handle this like businessmen," and initiate a hostile takeover of Viacom. (Among other things. There were other very viable options available, such as sticking to the actual law and pushing back against attempts to abuse the court system to expand it beyond its bounds, but that’s the one I like the best. "Play the game as it was being presented" when the person presenting it is your enemy who is making it up as they go was the worst possible response.)

PaulT (profile) says:

Re: Re: Re:8 Re:

Wow, there’s so much there I don’t know how to unpack it all.

First off, Viacom is hardly “some movie studio”. Would Google in 2008 have been able to just buy them out? Almost certainly not. But, even if they did – so what? The Viacom case is only notable because of how clearly stupid it was (they listed content they’d uploaded themselves as infringing). There were hundreds more already in place, and more and more people were lining up with lawsuits both frivolous and justified, that number increasing as the sharks smelt the blood in the water. Should they just buy everyone out? Where does it end?

Then, this still wasn’t a core part of Google’s business. They could have taken a major hit and dropped YouTube completely and survived. Veoh was a video hosting company. Google was an advertising and search company that happened to have bought a video host. It would have hurt, but they could have said “screw this, we’re no longer interested” and sold YouTube off or written it down. Even if they didn’t we’d still be facing the same lawsuits right now.

They would have been fine. But, the rest of the sector with new legal precedents and investor uncertainty would have been screwed. Especially given that even Google weren’t able to negotiate a lot of its licensing deals until after this was in place. How would anyone else have had a chance?

ContentID is far from perfect, but again it’s better than the realistic alternatives, and no, buying the people suing them is not realistic.

Mason Wheeler (profile) says:

Re: Re: Re:9 Re:

But teh lawsuits! But teh lawsuits! But teh lawsuits! But teh lawsuits!

What about them? Once again, for the zillionth time, the law was on YouTube’s side! (As evidenced by the fact that they won the case!) You keep ignoring this simple fact. ContentID was never necessary. The law was on their side, and if they had said "screw you, the law is on our side and we’re sticking to our guns," that "new legal precedent" you keep talking about would indeed have come down, but it would have been a good one. But they didn’t do that. They caved and took the easy way out, and that did set a precedent–not one in a court of law, but a very real precedent notwithstanding–that’s been screwing over everyone except the major publishing interests to this very day. And that was the wrong thing for them to have done.

I don’t know how it’s possible for you to know as much about this whole issue as you evidently do, and yet remain unaware of this basic fact. But somehow you’ve managed it!

PaulT (profile) says:

Re: Re: Re:10 Re:

“(As evidenced by the fact that they won the case!) “

FFS, you’re missing the fact that this was ultimately irrelevant. One of the core components of those cases being brought forwards was the idea that YouTube was somehow complicit in infringing because they didn’t do anything proactively to look for or filter infringing content.

You’re focussing on the one lawsuit, I’m focussing on the entire bodies of hundreds, or maybe even thousands upcoming that used the same types of arguments – and those others might have had more compelling evidence than the obviously ridiculous Viacom one.

Sure, they may well have won all the cases eventually, but they’d probably still be fighting them now with zero movement forward in the industry.

Mason Wheeler (profile) says:

Re: Re: Re:11 Re:

One of the core components of those cases being brought forwards was the idea that YouTube was somehow complicit in infringing because they didn’t do anything proactively to look for or filter infringing content.

Yes, and that’s my entire point: this is what they call a "novel legal theory," which is legalese for "you’re making up stupid crap that has no foundation in the actual law." The law did not require them to proactively look for or filter infringing content. In fact, it specifically said they had no obligation whatsoever to do so. The. Law. Was. On. Their. Side.

You’re focussing on the one lawsuit, I’m focussing on the entire bodies of hundreds, or maybe even thousands upcoming that used the same types of arguments

I’m focusing on one lawsuit because one is all it takes. All they had to do was get the one win, and then they could use that as a precedent to say "this case needs to be thrown out because it’s based on the exact same faulty reasoning as this other case that we won" for all the rest of them.

PaulT (profile) says:

Re: Re: Re:12 Re:

“The. Law. Was. On. Their. Side.”

It took 6 years and multiple court proceedings to settle the case (ultimately settled out of court), and Viacom specifically changed their demands part way through as a direct result of ContentID being put into place.

https://www.nytimes.com/2010/06/24/technology/24google.html

“To a large extent, the case addressed past conduct, as Viacom said it was not seeking damages for any actions since Google put in its filtering system, known as content ID, in early 2008.”

In other words, had they not created ContentID, they would have continued to seek damages for current activity, about which they may have been able to formulate a good enough argument to win.

“All they had to do was get the one win, and then they could use that as a precedent”

…and all they needed was one loss and it could be used as precedent against not only YouTube, but every other video provider that didn’t have Google’s warchest. Had Google not created ContentID, and had Viacom been competent enough to not include videos they uploaded themselves as the primary evidence, there is a risk that either they could have lost, or that Google would have decided the potential losses weren’t worth the potential gains and written off YouTube completely before the case was settled, leaving the entire sector at risk.

Again, ContentID is not particularly well implemented and is annoying as hell, but you’re deliberately missing the context about why it exists.

Mason Wheeler (profile) says:

Re: Re: Re:9 Re:

Also, your timeline is all wrong. You seem to be under the impression that the lawsuits caught Google unawares, but nothing could be further from the truth; they were already taking place, against the independent company YouTube, at the time Google chose to rescue them by buying them out. So once again, you’re arguing on the basis of factually incorrect data.

Mason Wheeler (profile) says:

Re: Re: Re: Re:

Gotta love the false dichotomy here.

What I’d prefer is a situation that puts copyright back in its proper place, with its proper perspective, rather than the inside-out insanity of the current regime.

What I’d prefer is for principles of Due Process and the Presumption of Innocence, which are quite uncontroversial in other contexts, to be applied here: that content accused of copyright violation is innocent until proven guilty in a court of law.

What I’d prefer is for great power to actually come with an unavoidable great responsibility, rather than bringing with it the power to dodge responsibility, as is all too often the case.

Anonymous Coward says:

Re: Re: Re:4 Re:

Oh yeah for sure, I get that part. But now it seems others are pivoting to discussing broader online copyright issues as a key aspect of analyzing s.230 – and that is just a crazy, messy way to go about this. Let’s not start with the one huge area of speech that is specifically excluded from s.230 and covered by its own entirely different piece of safe harbors legislation.

Christenson says:

Re: Re: Content ID

TAC was pointing out unintended consequences.

I’m pretty sure TAC thinks ContentID is a problem because it is used mindlessly for malicious purposes. See Dancing Baby case. But algorithmic filtering can also be extremely useful.

Here’s another: Techdirt does a certain amount of algorithmic content moderation. My accidental blank-body posts disappear nicely, thank you. Should such rules apply to Techdirt? What about our voting on which posts to hide by flagging?
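For illustration only, here’s a hypothetical sketch of what such a rule might look like (not Techdirt's actual code, and the flag threshold is a made-up number): even this trivial bit of housekeeping is an "algorithm" that hides and filters user content.

```python
# Hypothetical sketch only -- not Techdirt's actual moderation code.
FLAG_THRESHOLD = 5  # made-up number for illustration


def should_hide(comment_text: str, flag_count: int) -> bool:
    """A tiny moderation 'algorithm' of the kind the proposal would sweep in."""
    if not comment_text.strip():      # accidental blank posts disappear
        return True
    if flag_count >= FLAG_THRESHOLD:  # enough community flags and the comment gets hidden
        return True
    return False
```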

Uriel-238 (profile) says:

Re: Penalizing federal and state officials for lying

I’m pretty sure there are laws that do this but are just not enforced. It’s a felony to lie to an official, but when a congressperson lies on the floor, that’s exactly what they’re doing.

But our prosecutors don’t go after officials any more than they go after law enforcement officers.

In the meantime, a law that prevents social media sites from moderating would stand in opposition to many other current customs that are at least defined by precedent if not by law. It would mean copyright could not be enforced, nor could advertisements for trafficked humans be removed. It would also be impossible to comb out child porn.

Anonymous Coward says:

Re: Re:

Never mind ContentID – that list includes "promote", so even a "suggested videos" or "trending videos" or "watch next" link would lead to liability. It seems that even allowing a user to completely manually select a set of topics they are interested in, then showing them algorithmically selected videos that relate to those topics, would count. I mean really everything that happens on every website in any way involves several "algorithms" – loading a video and playing it after someone clicks on it is an algorithm; returning even completely-unfiltered results for a search is an algorithm.
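To make that concrete, here’s a purely hypothetical sketch (not taken from any real platform's code): even the most boring "newest posts first" feed hides, promotes, and filters user content under that wording.

```python
# Purely hypothetical sketch -- not any platform's real code.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    timestamp: float


def build_feed(posts: list[Post], page_size: int = 20) -> list[Post]:
    """The plainest feed imaginable still hides, promotes, and filters."""
    visible = [p for p in posts if p.text.strip()]         # "filters" out empty posts
    visible.sort(key=lambda p: p.timestamp, reverse=True)  # "promotes" newer posts over older ones
    return visible[:page_size]                             # "hides" everything past the first page
```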

Of course we haven’t seen the full bill yet, so in theory it could be written with exceptions for those kinds of cases. And I am totally completely confident that Rep. Gohmert and those he will work with are fully technologically literate enough to do that properly. Aren’t you? 😉

Anonymous Coward says:

Re: Re:

For the love of all, I would just want to see the look on this “congressman’s” face when he tries to make a comment on a site and sees it filtered, goes back to the chamber to complain, and is then told that HIS OWN BILL is the reason why he can’t talk, that if he keeps doing what he is doing it will continue, and that nothing can be done PER LAW, however this comes out. Because you know he barely has any idea how to use anything other than a typewriter.
And if he has a problem, he can DEAL WITH IT.

Anonymous Coward says:

Section 230 is part of the Communications Decency Act of 1996, not the Communications Act of 1934.

Either Representative Gohmert doesn’t know how to find the law he wants to change, or he’s trying to imply that he’s fixing an old law that doesn’t apply to the modern internet.

Either way, that’s really sloppy.

cpt kangarooski says:

Re: Re:

No, it’s just that it’s sloppy as hell to refer to laws in this way.

In 1996, Congress enacted the Telecommunications Act of 1996, which amended the Communications Act of 1934. Part of the 1996 Act was the Communications Decency Act of 1996, and part of that was Section 509, which starts out by saying “Title II of the Communications Act of 1934 (47 USC 201 et seq.) is amended by adding at the end the following new section: ‘Section 230. Protection for Private Blocking and Screening of Offensive Material’” and it goes on from there.

When people say section 230 of the Communications Decency Act, they really mean 47 USC 230. Using the actual US Code cites is way more useful. No one looks stuff up by the public law once it’s enacted.

ECA (profile) says:

REALLY?? Let him do it..

https://www.state.gov/j/ct/rls/other/des/123085.htm

Then every group on this list can go onto the net and say and SHOW as they please..
YOU CAN NO LONGER STOP ANYONE FROM POSTING..

You can’t have the RIGHT to be forgotten.
You can’t remove derogatory comments.
You can’t sue a person for anything he says, because of this.
You can’t sue Google/YouTube/Bing/any service, because someone crosslinked to a news article..
Porn away… back to bestiality sites.. Pain, and disgusting habits..

We couldn’t stop him from turning every advert into a post for political status..

Do it ya idiot…pass this law, and everything you have done in the last 18 years is GONE…
The corps will love you. They won’t need to run multi-million dollar filters; the net will be that much faster.

YouTube and all the video storage on the Net will be faster… Kim Dotcom will love you.. As he can say his site was/is for those with Opinions.

Russell Grub says:

NO law in America can RESULT in over-turning the First Amendment.

Nope. Whether gov’t directly or attempting an end-around by way of corporations (which is literally fascism): if the RESULT is the same then it’s forbidden by We The People.

Mere statute CANNOT empower corporations to become de facto censors of political views on the very "platforms" that Section 230 creates precisely so we can publish without either gov’t or corporate A PRIORI censoring.

) The hosting corporations are made immune for what "natural" persons choose to publish. That does not mean the corporations are publishing, NOR that they’re forced to host what they don’t want to associate with: it’s a DEAL to help The Public and corporations, not investing censoring power in the latter.

) Persons can still be liable for what they publish. But that’s their freedom.

) Corporations can in "good faith" remove content which is outside common law terms. They cannot make up their own definitions of "hate speech" by which to censor.

That is what Section 230 says. It DOES NOT empower corporations to arbitrarily censor.

Russell Grub says:

The latest citations show "platforms" are Neutral Public Forums,

where "natural" persons have First Amendment Rights, NOT money-machines for corporations having unlimited power to stifle us:

In the Sandvig v Sessions decision, from page 7 on, "A. THE INTERNET AS PUBLIC FORUM".

The discussion is businesses versus "natural" persons.

You’ll need to read the whole thing. Minion here clearly didn’t bother because it refutes his premise.

(page 8) Only last Term, the Supreme Court emphatically declared the Internet a primary location for First Amendment activity: "While in the past there may have been difficulty in identifying the most important places (in a spatial sense) for the exchange of views, today the answer is clear. It is cyberspace…."

(page 9) The Internet "is a forum more in a metaphysical than in a spatial or geographic sense, but the same principles are applicable."

Key point: "the same principles are applicable." — Again, that’s applying to "natural" persons who in that case were accessing web-sites against TOS and corporate wishes, which of course is EXACTLY apposite to using forums and requiring them to be NEUTRAL.

The bottom-line question: Do YOU want to be SUBJECT to Corporate Control? — If so, just follow Masnick blindly, he’s leading you into the high-tech prison where you won’t have any "platform" to complain!

Anonymous Coward says:

Re: The latest citations show "platforms" are Neutral Public For

The bottom-line question: Do YOU want to be SUBJECT to Corporate Control?

No, I don’t! Which is why I’m very happy s.230 protects me from being sued by corporations if someone posts a defamatory comment about them on a small blog or forum I operate. Why are you so desperate to give corporations and the wealthy more power to censor the public via lawsuits?

Anonymous Coward says:

Re: Re: The latest citations show "platforms" are Neutral Public

Do YOU want one person with an axe to grind to be able to destroy someone’s reputation (or business) by weaponizing search engines?

Section 230 destroys an individual’s ability to protect their name or their business. Russians engage in “reputation blackmail.”

So you think Ripoff Report is a good thing then.

Anonymous Coward says:

Re: Re: Re: The latest citations show "platforms" are Neutral Pu

Do YOU want one person with an axe to grind to be able to destroy someone’s reputation (or business) by weaponizing search engines?

I assume you mean in a circumstance where said person is knowingly using libelous falsehoods to do so, rather than simply saying things that are true? In that case no, which is why I’m glad the person doing that is subject to defamation law.

Anonymous Coward says:

Re: Re: Re:2 The latest citations show "platforms" are Neutra

Except that the person isn’t causing the damage, the search engines are; the person could be judgment-proof, or in a country where they can’t be sued, and they can be working for powerful entities or individuals who want to silence critics.

The potential for abuse made possible by Section 230 is obvious. All I said was that supporters of the law are placing the rights of big internet companies above the reputation of this class of individuals.

If search engines were smart they’d eliminate this concern by allowing this class of individuals to protect their reputations. If they don’t, it will likely be the undoing of Section 230, probably when some beauty queen is Google-bombed by an ex-boyfriend and becomes the poster child for unaccountability.

Toom1275 (profile) says:

Re: Re: Re:3 The latest citations show "platforms" are Ne

Except that the person isn’t causing the damage, the search engines are; the person could be judgment-proof, or in a country where they can’t be sued, and they can be working for powerful entities or individuals who want to silence critics.

Right, it isn’t the villain that’s causing the damage when he destroys that city, it’s the civilians’ susceptibility to death by nuclear fire that’s the root problem.

The potential for abuse is obvious.

Matthew Cline (profile) says:

Re: Re: Re:3 The latest citations show "platforms" are Ne

If search engines were smart they’d eliminate this concern by allowing this class of individuals to protect their reputations.

How, exactly? If you just allowed search engines to be sued for indexing defamatory stuff, they’d respond by just not linking to anything negative about anyone, except for court decisions, because they can’t tell ahead of time what’s defamatory or not. And that’s assuming that there’s some algorithmic way to distinguish potentially defamatory material from material which can’t be defamatory.

If the solution is to have a law saying that search engines must take down links to negative material about John Doe if John Doe claims that the material is defamatory, how do you prevent that from being abused by people who claim negative material about them is defamatory when it’s actually true?

PaulT (profile) says:

Re: Re: Re:3 The latest citations show "platforms" are Ne

“Except that the person isn’t causing the damage, the search engines are”

So, you don’t know how search engines actually work? Figures.

” the person could be judgment-proof, or in a country where they can’t be sued”

So what? It being less convenient for you to attack the person doing the deed does not make it acceptable to go after innocent 3rd parties who happen to be easier to get to.

“The potential for abuse made possible by Section 230 is obvious”

Not nearly as obvious as the abuse that would happen without it.

“All I said was that supporters of the law are placing the rights of big internet companies above the reputation of this class of individuals.”

No, they also very much support individuals, but your Google hate boner is too strong to allow you to think about the real arguments.

Anonymous Coward says:

Re: Re: Re: The latest citations show "platforms" are Neutral Pu

So you think Ripoff Report is a good thing then.

I’d like you to take a moment here and think about what you’re saying: that if someone supports a law, they must believe that everything which it permits/enables is broadly "good".

I don’t think that’s a very tenable position.

I also support laws preventing police from kicking down any door they feel like without a warrant and searching houses on a whim. And yet, funny thing, that doesn’t mean I believe everything everyone does inside a house is "a good thing".

That One Guy (profile) says:

Re: Re: Re:3 Thanks for the laugh

I gotta say, probably the funniest part of your comments is that you think anyone, up to and including the owner of the site, would believe someone so grossly dishonest, a person who has a history of making unsupported claims/threats and then running away and/or changing the subject when asked to support them.

It’s like watching a habitual liar of a little kid threaten a full grown adult by claiming that their dad could totally beat them up, so they’d better do what the kid wants and take them seriously.

Anonymous Coward says:

Re: Re: Re: The latest citations show "platforms" are Neutral Pu

So your solution to people posting bad things online is to remove the protection of services that allow anyone to post online. You are one of the people who, if you get your way, will prevent everybody else from having nice things that enrich their lives.

Anonymous Coward says:

Re: Re: Re:

They can host whatever they want, doesn’t mean they have to have immunity.

Offline platforms don’t have immunity. Section 230 even immunizes websites against illegal housing and employment ads. It used to immunize sex-trafficking ads but no more.

Distributor liability for defamation has 150 years of precedent behind it. Libel laws were an alternative to DUELING.

Anonymous Coward says:

Re: Re: Re: Re:

They can host whatever they want, doesn’t mean they have to have immunity.

I understand what you are saying, but the original commenter in this thread has a more expansive position that they have established firmly over time: they believe that it is an illegal, unconstitutional violation of the first amendment for an internet platform to moderate user content. They assert that even under the s.230 immunity provisions as currently written, any platform that has (for example) a hate speech policy or a harassment policy is somehow in direct violation of the US constitution and should face serious penalties, and the government should force them to eliminate those policies.

Stephen T. Stone (profile) says:

Re: Re: Re:

They can host whatever they want, doesn’t mean they have to have immunity.

Can you prove the people who own and operate Twitter have created defamatory speech or have actively encouraged/helped someone publish defamatory speech? If not, for what reason should Twitter be held legally liable for defamatory speech written by a third party?

Anonymous Coward says:

Re: The latest citations show "platforms" are Neutral Public For

Sandvig v. Sessions addressed whether it is a 1A violation for congress to create a law that makes TOS violations a crime, or more specifically for courts to interpret and enforce the CFAA in that way.

It had nothing to do with whether websites are allowed to have TOS agreements, or whether they are allowed to enforce them and deny (or attempt to deny) service to anyone who violates them.

That One Guy (profile) says:

Ah politics...

Where the dumber you are, the easier your job is, and you can feel safe in the knowledge that no one will call out your monumental stupidity to your face, allowing you to lie and/or make a fool of yourself all you want.

Idiots like this should never be allowed in positions of power, as you can be damn sure (as evidenced now) that they’ll take their stupidity and run with it, causing immense damage in their quest to look like they ‘stuck it to those dastardly companies’, despite the fact that everyone but the large companies he’s jousting against will get screwed over vastly more.

If there’s one silver lining to this idiot’s actions it’s that if by some disaster he does manage to cram his train-wreck of a law through I can guarantee it will not go the way he and the buffoons cheering him on think it will. If they think they’re being treated ‘unfairly’ now, when sites aren’t liable for what users post, they will not like what happens to them and theirs when sites are liable.

Aaron Walkhouse (profile) says:

Re: Re: Re: Whether or not you have a physical presence…

…only changes some methods of penalty and process.

Physical evasion does not negate laws or regulation. Those DOING BUSINESS in any jurisdiction are subject to its laws and regulations, no matter their location.

The Dotcom case is an example where Biden caused U.S. authorities to ignore normal legal processes which could have worked and tried to make an experimental criminal precedent. That created the quagmire which outlived Biden’s VP career and is likely to see him sued for costs once it collapses.
