As Intermediary Liability Is Under Attack, Stanford Releases Updated Tool To Document The State Of Play Globally

from the useful-stuff dept

We’ve spent many years talking about the issue of intermediary liability on the internet. While nearly everyone agrees the term sounds boring as anything, it’s incredibly important in protecting your rights to express yourself online. The key issue is who is liable for speech that is posted online. The common sense reaction should be that “the speaker” is responsible for any speech they make online. However, for reasons I still don’t fully comprehend, many, many, many people would prefer that the site hosting the speech be liable. In many cases, this might not seem to matter. But it can actually matter quite a bit for absolutely everyone. While most speech is perfectly legal, there remain some exceptions (including copyright, defamation, true threats and more).

And while some people think that those exceptions are narrow enough that pinning liability on websites shouldn’t be a big deal, that’s not true in practice. Because if you say that the website (the intermediary or platform) is liable for the speech, then merely making an accusation of illegality in the speech has a high likelihood of censorship of protected speech. That’s because most platforms will take down speech that is reported in an attempt to avoid potentially crippling legal liability. Indeed, in many cases, platforms are then pressured (either by law or threat of laws or legal action) to pre-filter or moderate certain content just to avoid even the possibility of legal liability.

And because of that, lots of perfectly legitimate, protected speech gets blocked and censored. Much of this is abusive. Because once you’ve supplied a tool that allows someone to flag certain content for censorship, that tool gets used, even if the content doesn’t really qualify, and the internet platform is heavily incentivized to remove that content to avoid liability.

That’s why this matters so much. That’s why we’re so concerned at attempts to chip away at intermediary liability protections in the US, such as the immunity clause under CDA 230 or the safe harbor clause under DMCA 512. But the US is, of course, just one of hundreds of countries. And lots of other countries have their own (frequently changing) laws on intermediary liability. For years Stanford’s Center for Internet and Society has hosted a World Intermediary Liability Map, and that map has just been updated. This is an incredibly thorough and useful tool for understanding how these laws play out in other countries, how they differ, and even the impact of how they work.

With the updated version, you can also drill down on topic pages around specific types of liability regimes, such as looking at how the Right to be Forgotten has been spreading around the globe, or look at how intermediary liability is handled around the globe for copyright or look at the monitoring obligations imposed by various laws.

For those of us who continue to believe that proper intermediary liability laws are key to a functioning internet and freedom of expression online, this is a fantastic tool — only slightly marred by the fact that so many of the developments concerning intermediary liability (including here in the US) have been around successful attempts at chipping away at those principles, leading inevitably to greater censorship.

Companies: stanford


Comments on “As Intermediary Liability Is Under Attack, Stanford Releases Updated Tool To Document The State Of Play Globally”

28 Comments
Daydream says:

Re: Re: Better yet…

I live next to a secondary school; the road in between my house and said school is actually on a slight hill.
If you stand at the top of the hill, you can see cars coming both ways, but if you’re standing to one side or the other, you can’t see cars coming until they’re one or two seconds away from running over you.

Guess where the pedestrian crossing is? Not at the top of the hill.

So there’s your argument for bad road design contributing to potential accidents and why the government should be liable.

Anonymous Coward says:

>Indeed, in many cases, platforms are then pressured (either by law or threat of laws or legal action) to pre-filter or moderate certain content just to avoid even the possibility of legal liability.

That could, and probably would severely limit the amount of speech published on the Internet, especially as anything that raised the slightest question would be dropped or lost in the ever growing moderation queue.

That One Guy (profile) says:

"Well yeah, but the light's better over here."

However, for reasons I still don’t fully comprehend, many, many, many people would prefer that the site hosting the speech be liable.

I suspect it’s a mix between having an easier target, offloading the work onto someone else, deniability when it comes to censorship, and not having an easy ‘fix’ if they had to go after the actual speaker.

A big company tends to be a lot easier to get a hold of, and a lot more visible, whereas going after an individual would require personal data that might not be available immediately, and would likely require that you convince the site that you have solid grounds to be given that information, which may very well require taking the matter to court.

By offloading the liability to the platform you also offload the work finding ‘objectionable’ material, which means all the work is on their end, you can blame them if they don’t get ‘everything’, and any claims of censorship can be laid at their feet should anyone object, rather than the one who is demanding/’persuading’ that the content be taken down.

Lastly, for content that is ‘objectionable’ but not actually illegal or in violation of a site’s ToS, if you had to deal with the speaker directly you’re not likely to have much luck finding an excuse as to why the content should be pulled. If you make the site responsible though, and make it clear that if they don’t do a ‘good enough job’ then they’re in trouble, they are very likely to go overboard and remove even questionable stuff ‘just in case’, likely taking care of the ‘problem’ without you having to do a thing.

Anonymous Coward says:

Re: "Well yeah, but the light's better over here."

Several years ago I read a summary of a case where a judge allowed a lawsuit to go forward against an intermediary provider specifically because the actual speaker likely wouldn’t have any money to pay damages while the provider would.

That One Guy (profile) says:

Re: Re: "Well yeah, but the light's better over here."

I dearly hope you’re misremembering that one; the idea that the judge allowed a lawsuit against a party that was only marginally involved simply because they had more money is insane. They might as well have flat out said ‘Profit is more important than sound legal footing’ with a ruling like that.

Uriel-238 (profile) says:

Re: Re: Re: "Profit is more important than sound legal footing"

That actually sounds like typical judge logic here in the US.

Other favorites are:

Screw defendant rights if the crime is awful enough

Screw defendant rights if it’s close to a border

Screw defendant rights if he’s black / a terrorist / a pedophile / I don’t like him

The loot seized is high value? Rightful forfeiture

OMG Good faith exception? well that changes everything!

The defendant is law enforcement? Acquitted!

The crime is too complicated to understand? Guilty!

Judges in the US are commonly biased and/or idiots. What I can’t tell is whether they’re typically biased or idiots.

The Wanderer (profile) says:

Re: "Well yeah, but the light's better over here."

That certainly covers a lot of it, but I’m not so sure it’s the root of the reason in many cases.

I suspect that many people are working from the (probably implicit and subconscious) idea that “by agreeing to host the content – either when you know what the content is in advance, or by continuing to host it after you learn what it is – you are speaking that content yourself, and therefore you are the speaker, and can be held liable for the speech”.

That is, it’s not that they’d object to the original speaker being liable – they just consider anyone who chooses to cooperate / collaborate in the act of speaking, such as someone who chooses to host the speech, to be equally responsible for what is said.

That’s why that whole tangle of ideas including terms like “red-flag knowledge” and “knew or should have known” develops in the first place.

The result of all of that is the setup you describe, but I think that setup is all based on this deeper root.

Anonymous Coward says:

It would seem that, unfortunately, the laws regarding intermediary liability are probably irrelevant in most cases.

A familiar occurrence goes something like this:

A small independent site owner gets an email saying that a posted comment is not just incorrect, but is libelous, defamatory, or whatever else, and demands that it immediately be taken down under the threat, whether actual or implied, of a potentially expensive lawsuit. So the comment naturally gets deleted as a purely economic and/or survival decision.

I’ve had product reviews taken down this way, and I had no hard feelings toward the site owners for doing what was necessary to survive. Large sites that have the money to fight bogus lawsuits can be just as risk-averse, however, as a more bottom-line focused corporate mentality replaces the more ethically or ideologically-charged principles of the random young guy running a site, someone who passionately hates bullies but knows his limitations and recognizes the risks of getting sued, especially by someone with deeper pockets.

Anonymous Coward says:

“And because of that, lots of perfectly legitimate, protected speech gets blocked and censored. Much of this is abusive. Because once you’ve supplied a tool that allows someone to flag certain content for censorship, that tool gets used, even if the content doesn’t really qualify, and the internet platform is heavily incentivized to remove that content to avoid liability.”

Because their current goal is back-handed demonetization of an ever-increasing wholesale amount of content, backed by cheerleaders who fool nobody (cheerleaders like you), it is ALWAYS ABUSIVE in the hands of Google and its subsidiaries, primarily YouTube.

:Lobo Santo (profile) says:

I think we're becoming Cardassia...

(a fictional planet & race from Star Trek: Deep Space 9)

I specifically remember a scene where the Doctor says to Garak, “I’m sick of these Cardassian mystery novels. There’s no mystery, everybody is always guilty!”.

Garak replies: “Of course, Doctor, the mystery is: who is guilty of what?”

– – – – – – – – – – – – –

Can you imagine the world we’ll create if this becomes common and normal? Nobody will start a new internet business unless they can already afford cutting-edge censorship techniques and an army of lawyers.

Anonymous Coward says:

Re: I think we're becoming Cardassia...

and having NO NEW INTERNET COMPANIES is exactly what one industry (three guesses which one, and the first 2 don’t count) wants, and this whole ‘liability’ shuffle is just the first step in this cold war (and make no mistake, we have been at war with USAA* since the first recording Mr Bell made.)

It’s a war of attrition and obfuscation and they are doing an excellent job of moving the goal posts every time we get close, ooh look over there a shiny… Did something happen while we were looking the other way (probably but we won’t know for a while yet).

Anonymous Coward says:

‘for reasons I still don’t full comprehend’

surely the reason is because the sites can easily be found, as can, in most cases, those running and acting as admins on the sites, whereas the individuals who are posting on the sites are much more difficult and sometimes damn near impossible to find! there’s nothing easier than being able to blame joe, if he can be found, regardless of whether he has done anything wrong or not, rather than finding xy3 who is hiding somewhere! on top of that, if blaming joe means he is told to do something, refuses and has to go to court, the cost will probably shut the site anyway, so it’s a win-win situation for those who want to impose their will on the site and will do the same, eventually, everywhere!

people seem to be oblivious as to what is actually going on, world-wide, where the rich, the famous, the powerful few and their friends are actually taking control of everything, by locking things up, locking us out and preventing us from knowing what the fuckers are up to, while knowing the ins and outs of everything to do with us, 24/7!!

Contrarian says:

But you're assuming all platforms are just passive hosts

Website #1: http://www.findahitman.com. Make an account, pay us $20, and we’ll provide you a list of professional hit men. Hell, we may even make the introduction.

Website #2: hitmentruecrimestories.com. Come on our site and post your stories about famous hit men and interact with those who share your passion.

Same type of platform?

That One Guy (profile) says:

Re: But you're assuming all platforms are just passive hosts

Different types of platforms. Laws that shield against intermediary liability, like 230, don’t do squat if the platform is deliberately involved in something illegal.

If the site is actively involved, rather than passively, then they can be held liable.

As for your two examples, #1 would probably not be protected, as it seems to be actively engaged in facilitation of illegal activity (assuming it’s not a honey-pot, anyway), whereas #2 would be, as the description you list seems to suggest it’s just a site to share stories, which isn’t illegal even if the activity described is.
