How Free Speech Online Is Enabled By Not Blaming Websites For The Actions Of Their Users

from the save-section-230 dept

We've written many, many times about the importance of Section 230 of the Communications Decency Act, which provides protections against secondary liability claims (except for intellectual property claims) for internet service providers. In its simplest form, Section 230 says that if a user does something illegal on a site, you need to blame the person who did it, rather than the site. As we've said repeatedly, this makes so much sense it's almost troubling that you need a law to make the point. You blame the person who broke the law, not the tools they used. Duh. You don't blame Ford because a Ford was the getaway car in a bank robbery, and you don't blame AT&T because the extortion call was made via an AT&T phone line.

But, on the internet, blame often gets thrown around wildly, and websites themselves are too frequently the targets of misguided legal claims. Section 230 protects against that and lets companies quickly extricate themselves from bogus legal claims. Unfortunately, Section 230 has been under attack lately from a number of parties. There are a bunch of state Attorneys General who are looking to change the law so that it won't apply to cases they might bring against sites. You have short-sighted law professors who think that the way to fix "bad" sites online is to gut Section 230. And, unfortunately, you have one really bad district court ruling (which disagrees with every other ruling on the law) that hopefully will get reversed on appeal.

Much of the academic discussion concerning the importance of Section 230 has, quite reasonably, focused on its impact on innovation. As some have pointed out, Section 230 can, in many ways, be credited with "giving us the modern internet." So many of the sites you know, love and use today probably wouldn't exist, or would exist in greatly limited forms, without Section 230. It's truly enabled a revolution in innovation online, allowing sites to build powerful tools for the public without having to be liable for how the public then uses those tools.

And that brings us to a second important point about Section 230, one that perhaps gets less attention, though it probably should get much more: Section 230 has really helped to enable a tremendous amount of free speech online. On a recent edition of his Surprisingly Free podcast, Jerry Brito interviewed Anupam Chander, who recently co-authored a paper (with Uyen Le) highlighting "the free speech foundations of cyberlaw" and how Section 230 has been truly key in guaranteeing a lot of free speech online -- basically showing how the First Amendment and Section 230 work well together. However, the paper also points out that it is the First Amendment that underpins all of this, and that we should be wary of challenges to the law that might undermine the First Amendment.

If you know the history of the Communications Decency Act, you know that it was a very, very different bill from Section 230. The original CDA was basically the opposite: a bill to censor websites that carried "indecent" communications. Most of that got thrown out (after quite the legal fight) as unconstitutional under the First Amendment. What remained, and should continue to remain, is Section 230 -- the one part of the bill that was consistent with the First Amendment.

The paper also goes into how copyright law, separately, has been trying to chip away at the First Amendment online, and how dangerous that is as well. Remember, Section 230 doesn't apply to copyright law, though there is Section 512 of the DMCA, which is similar, but not nearly as strong. The paper notes that things like SOPA really were a direct attack on the First Amendment, which is why it's a good thing that it failed.

All in all, it's a good reminder of both how important protecting free speech online is today and how fragile it is, since the laws that have made it possible are so constantly under attack.


Reader Comments

  1.  
    Marak, Feb 14th, 2014 @ 8:32pm

    I believe it's a case of people looking at narrow issues instead of the larger implications.

    At least I hope that humanity hasn't gotten so retarded that it wants to break things it is afraid of.

     


  2.  
    Just Sayin', Feb 14th, 2014 @ 9:04pm

    Good arguments, but...

    The arguments made are good, and it's all fine and dandy, but translate them to the real world (deleting the old "on the internet" clause) and you might see things differently.

    If I have a billboard company that allows anyone to put up any advertising and entirely refuses to take it down, even if it's obscene, abusive, illegal, or slanderous... what would happen? Clearly, as the billboard company, I would be in a position to stop such speech. In fact, I am a key part of the process of making that speech. Without the billboard, the speech (good or bad) would not be made in that manner.

    So the billboard owner isn't just a passive "service provider," but more of an enabler, one who aids in making such speech. The result is that advertising companies of all sorts (and any other publishing company) have certain responsibilities under the law to deal with such issues.

    Section 230 is wonderful, except that it grants a level of protection above and beyond what would be found in the real world. If you anonymously put up slanderous posters on every light pole in town, you can be sure that someone would work to figure out who put them up, and take you to court for it. Heck, you might even end up with a criminal case (harassment) to deal with as well.

    A website owner has some responsibility to deal with what is on their site. Most take this for granted at least to some extent; for example, I don't think we can post child porn or death threats here in the comments on Techdirt. Like it or not, you have ignored Section 230 to do what is right, morally and legally. Having crossed that line already for CERTAIN illegal or distasteful acts, why would you think that you have no liability for the rest of them?

    Section 230 would work and get supported by all sides if it didn't create a special "on the internet" class which allows users to hide behind a website or ISP's skirt. It creates a hole where the website / service provider won't take action, and at the same time impedes access to the information required for an injured party to take action. It creates a situation where the abuser gets a double helping of protection, and the abused party gets told to pound sand.

    The hope should be that over time, the internet remains free with a little sideline of "but you don't get more rights here".

     


  3.  
    ECA (profile), Feb 14th, 2014 @ 11:07pm

    What would happen to TV

    Think about this for a few seconds..

    WHAT would TV channels do, if they were liable for commercials and TV programs??

    HOW many Pharmacy commercials would be WIPED off of TV??
    How many Money loan commercials?

    Fox news? GONE..

     


  4.  
    schizoid, Feb 15th, 2014 @ 1:10am

    I remember the CDA battle. Back then we didn't need section 230 because discussions took place on Usenet, which was pretty much uncensorable.

    Then web forums were created, giving the censors targets to attack. Everything went downhill from there.

     


  5.  
    That Anonymous Coward (profile), Feb 15th, 2014 @ 2:32am

    Re: What would happen to TV

    It was the last one that made me give up my sanity and decide to support this stupid idea.

     


  6.  
    Anonymous Coward, Feb 15th, 2014 @ 2:34am

    Remember when the greatest threat to the Internet was some asshole who made money off of other people's hard work and slapped watermarks on them, as opposed to The State itself?

    Why can't we go back to that?

     


  7.  
    Anonymous Coward, Feb 15th, 2014 @ 3:17am

    Response to: Anonymous Coward on Feb 15th, 2014 @ 2:34am

    That was never really a threat to the internet itself so it would indeed be nice to go back to that.

     


  8.  
    Anonymous Coward, Feb 15th, 2014 @ 4:31am

    Re:

    "At least I hope that humanity hasn't gotten so retarded that it wants to break things it is afraid of."

    Unfortunately the bad habit of breaking things that they are afraid of has deep roots, and is the basis of tribalism. To a large extent it is the habit of humanity that is harnessed by those who seek power as king, emperor, pope, prophet etc.

     


  9.  
    ECA (profile), Feb 15th, 2014 @ 6:42am

    Re: Re: What would happen to TV

    Understand it though..
    Being able to SUE anyone for having a different opinion?

    Having to prove Facts?

    1/2 the Doctor shows,
    MOST of the COP shows,
    Don't want religion on TV? Sue them..
    Only thing on TV would be FACT or REAL fantasy..

    MORE Cartoons..

     


  10.  
    btrussell (profile), Feb 15th, 2014 @ 7:54am

    "...since the laws that have made it possible are so constantly under attack."

    Laws prohibit, not enable.

    Let's pretend there are no laws. Now make one law that allows me to do something that I can't already do.

     


  11.  
    Anonymous Coward, Feb 15th, 2014 @ 9:58am

    It is laws that enable big companies to sue individuals, as suing is a concept that came from human minds and does not exist in a natural environment.

     


  12.  
    madasahatter (profile), Feb 15th, 2014 @ 11:09am

    Re: Re:

    There will be moral panics in the future. Most current moral panics have their roots in the actions of a few idiots with a computer. While some of the behavior is reprehensible, it has to be tolerated if free speech means anything.

    Also, there is a recent tendency to blame the owner for the behavior of others, over which the owner has limited control, if any.

     


  13.  
    Anonymous Coward, Feb 15th, 2014 @ 11:18am

    Re:

    ... and it is laws that prohibit individuals from suing big companies.

     


  14.  
    btrussell (profile), Feb 15th, 2014 @ 12:13pm

    Re:

    The corporation isn't being enabled, the individual is being prohibited.

     


  15.  
    Anonymous Coward, Feb 15th, 2014 @ 12:28pm

    and by allowing people to do exactly that, unlike the UK, which claims to support 'Free Speech' but is now actively involved in internet censorship: a slippery slope that will be very difficult to get off, but one that will also hopefully aid in the demise of the present government! That is, if it isn't told to remove the blocks first by the EU when the UK is taken to court in the very near future over the GCHQ spying!

     


  16.  
    zip, Feb 15th, 2014 @ 2:06pm

    toxic reader comments

    Many Indymedia.org authors over the years have been raided by police (and a few convicted) because of a comment posted by an anonymous person that they obviously had nothing to do with. (Indymedia intentionally does not log IP addresses.)

    It is, sadly, the de facto responsibility of webmasters and bloggers to closely monitor all comments and quickly delete any that might result in a knock on their door -- or worse -- by police or federal agents. This is especially true if the blogger tends to write about controversial subjects, especially with an attitude that angers law enforcement authorities or the people/"legal persons" that influence them. Let's not forget that it's one thing to be in legal compliance, but entirely another to attract unwanted attention from law enforcement despite your legal compliance; not surprisingly, few people want to become the latest martyr by exercising their legal rights to the fullest.

    People who run websites of a controversial nature tend to develop a habit of vigorously patrolling the comment section and reflexively deleting the more extreme or inflammatory remarks (and banning the user, sometimes even back-deleting all previous comments) since experience has taught them they were likely put there by an agent provocateur trying to get them in trouble.

    Frankly, it's almost shocking that Techdirt still has a very liberal reader-comment policy after being online so many years. Although most sites start out adhering to the concept of 'free speech' (minus the obvious, universal no-no's, of course), this gradually erodes as more and more restrictions are put in place -- at least that's been the case on every site I've ever followed. Power and control being an ever-present human temptation, I can understand the site owner's (apparent) point of view: reader comments become increasingly time-consuming to monitor, tiresome to read, and the liability danger more acknowledged (and fear is a BIG motivator in overriding a person's sense of -- and adherence to -- ethics and principle.)

     


  17.  
    Just Sayin', Feb 15th, 2014 @ 5:20pm

    Re: toxic reader comments

    " it's almost shocking that Techdirt still has a very liberal reader-comment policy after being online so many years"

    Techdirt intentionally holds comments for moderation when they come from people they don't like. Basically, while Mike talks a good game about freedom of speech, he clearly isn't beyond manipulating it to make himself look better. Techdirt as a whole dislikes any viewpoint that doesn't echo whatever the trend of the month is on this site, and Mike and his moderators are very careful to keep things all uniform and positive on the messages presented.

    It's the snowy white world of secret, quiet, and discreet management of viewpoints by blocking those they don't agree with. Some would call it censorship, but they swim away by allowing posts to finally appear days after they are posted, when people are no longer actively reading the thread. It's pretty sneaky, even for Techdirt.

     


  18.  
    John Fenderson (profile), Feb 15th, 2014 @ 5:56pm

    Re: toxic reader comments

    "It is, sadly, the de facto responsibility of webmasters and bloggers to closely monitor all comments and quickly delete any that might result in a knock on their door"

    No, it's not the responsibility, de facto or otherwise. Sites that do this are simply choosing to give up freedom in exchange for (apparent) safety. That is their right, and in some cases it may even be understandable, but it is not a responsibility.

    "Frankly, it's almost shocking that Techdirt still has a very liberal reader-comment policy after being online so many years."

    Not really. Such policies are still not rare amongst the better sites. Sites that do delete comments for fear of a "knock on the door" are sites that don't have comment sections worth reading.

    Note that I differentiate between deleting comments for fear of attention from the authorities and deleting comments because they are against the stated policies of the site in question. Enforcing an editorial policy is fair, acting out of fear is just cowardly.

     


  19.  
    Anonymous Coward, Feb 16th, 2014 @ 4:40am

    Re: What would happen to TV

    TV stations view and approve commercials all the time. Same for programming and even infomercials. They actually do take responsibility for the content of their users.

     


  20.  
    Anonymous Coward, Feb 16th, 2014 @ 5:12am

    Re: Re: What would happen to TV

    Not if the payment is good enough. Just look at big pharma and the recalls on medications that kill. If big cable were responsible, they would be issuing statements in primetime.

     


  21.  
    Anonymous Coward, Feb 16th, 2014 @ 5:57am

    Re: Re: What would happen to TV

    TV stations, publishers, labels and studios only deal with a small number of users, and can therefore vet everything that they output. Twitter, YouTube and major blogging sites have more contributions than they could possibly employ staff to vet in a timely manner.
    The big advantage of the Internet is that creative people can find an audience without having to convince a publisher to carry their work. This is enabling many more authors, musicians, artists, etc. to find an audience, and much of this work would not be available to the public if it had to be approved by a third party.
    Requiring that all content be approved by an editor is an insidious form of censorship, in that it requires that a third party, usually an editor, decide which created works will be offered to the public.

     


  22.  
    Gwiz (profile), Feb 16th, 2014 @ 6:37am

    Re: Re: What would happen to TV

    "TV stations view and approve commercials all the time. Same for programming and even infomercials. They actually do take responsibility for the content of their users."


    TV stations also "opt out" of that responsibility by broadcasting a bit of legalese stating that "the following are the views and opinions of others and not necessarily the views or opinions of this station or its owners."

    Section 230 does that as the blanket default for websites. How is that any different from what the TV stations do if you omit the opt-in part?

     


  23.  
    Marc John Randazza (profile), Feb 16th, 2014 @ 7:51am

    Lets not totally close our minds...

    One exercise I like to engage in is to take something that I believe in dearly, and challenge myself on that belief.

    Section 230 is one of those beliefs. When I look back at all the Section 230 cases that I have both handled and researched, it seems to me that it might be superior from a policy perspective to have some kind of notice and responsibility provision. If an OSP receives a notice that the content is somehow violative of the complainant's rights, then the OSP can either a) take it down, or b) accept legal responsibility for the content.

    I would, however, suggest that it would not be that simple -- as there should be prevailing party attorneys' fees on both sides - so if the OSP takes responsibility and gets sued, the content had better be actually violative of some legal right. I'd also like to see a provision that the author would receive statutory damages (payable by the complaining party) if the content was taken down due to a bogus complaint. And perhaps even the right of the OSP to bill the complaining party for its fees in having a lawyer review the complaint, whether the complaint is bogus or legitimate (after all, the OSP should not be taxed with the cost of a letter writing campaign by someone with censorious intent).

    Section 230 has been a good engine for development of countless services, and that is a good thing. Unfortunately, it has also been a good engine for harmful arrogance on the part of a lot of 650/415 area code businesses.

    There are companies that exercise responsibility - in my experience, Automattic is pretty protective of its users' rights, but it will not simply cover its ears and say "Section 230, Section 230, Section 230" when concerns are brought to its attention. I've represented a number of companies that have internal responsibility policies that make me proud to represent them. I have Sect 230 clients who don't give a shit too, and for as long as that is the law, then I'll defend their right to not give a shit until they tell me to relent.

    But, the problem with Section 230 is that it has become a license to not give a shit. The more irresponsible, the more profitable. That's not really desirable.

     


  24.  
    Anonymous Coward, Feb 16th, 2014 @ 8:32am

    Re: toxic reader comments

    "webmasters and bloggers to closely monitor all comments and quickly delete any that might result in a knock on their door"

    I imagine there are sites that patrol their comment sections out of fear. However, it is probably more prevalent that sites patrol their comment sections because they engage in censoring things they do not agree with.


    "reader comments become increasingly time-consuming to monitor, tiresome to read ..."

    Or, they tire of the additional overhead of going to legal counsel. Has anyone done a study of the effect upon traffic (ad revenue) when the comment section(s) are stifled or even completely removed?

     


  25.  
    Anonymous Coward, Feb 16th, 2014 @ 8:49am

    I thought that "corporations don't give a shit" was covered in BUS-101. Oh, I'm sure there are some companies that strive to do the "right thing," but aren't these usually privately held?

    Corporations have their Mission Statements and claims of being community members but when it comes to the bottom line, money matters most. I would not lay the blame for this at the feet of section 230.

     


  26.  
    That One Guy (profile), Feb 16th, 2014 @ 8:53am

    Re: Lets not totally close our minds...

    Not sure that would be an improvement; it seems that having a 'take it down or be legally responsible for it' way of handling user content would bring about a repeat of one of the big problems with the DMCA system, where there's plenty of incentive to pull the content, but not much incentive to keep it up.

    Now, the attorneys' fees provision would help that a bit, but even with a guarantee of recovering all the costs you incurred going to court (or even just the costs of a lawyer, should it not get that far), heading to court would still be a major hassle, and one a lot of sites, and likely all smaller sites with user-created content, would rather avoid, leading to said content likely being pulled at the first sign of trouble.

    I think the current system is probably the best way to handle it: if someone has a problem with content on a site, they go after the one who put it there, and afterwards, should they win in court, they can present the win to the site owner and have the content pulled. That way, the site itself is only involved at the very end, and has minimal work to do.

     


  27.  
    Marc John Randazza (profile), Feb 16th, 2014 @ 9:09am

    Re: Re: Lets not totally close our minds...

    I think you may have very little experience with the legal system. I've defended cases on this very principle -- "go after the person who posted it." I've been paid to support the Section 230 arrogance. And, for as long as it is the law, then I'm fine doing so.

    But, I can't say that I don't have personal feelings of sympathy for the poor bastards trying to deal with that. Did the original poster use Tor? Tough shit for you then. Is the original poster judgment-proof? Tough shit for you then.

    And, if someone does wind up proving that the content violates some right, they do so after five to six figures in attorneys' fees, and two years of fighting. Meanwhile, the OSP could have made a judgment call.

    Now sometimes they do. I've represented Sect. 230 businesses who would act responsibly, and take clearly violative content down. I've also represented aggrieved parties, and sent Sect. 230 businesses letters stating "I know you're protected by Section 230, but here's why this should come down..." and I have had good results.

    The current system is very nice for me, Google, and Facebook. I love the money I make defending Section 230. Google loves having as much content as it can, without giving half a shit about anyone else. But, to think that the current system is the best one is to think quite narrowly.

     


  28.  
    Anonymous Coward, Feb 16th, 2014 @ 10:30am

    Re: Re: Re: Lets not totally close our minds...

    In your "the system works" world there would be no corruption and all ne'er-do-wells would get their comeuppance. Awesome.

    However, there is no such thing. Whatever is put in place will be gamed and those that censor will have free rein with little recourse.

     


  29.  
    Anonymous Coward, Feb 16th, 2014 @ 10:39am

    Re:

    You know it. I still use Usenet for binaries, but 95% of text newsgroups are now dead/spammed to death.

     


  30.  
    Anonymous Coward, Feb 16th, 2014 @ 10:46am

    Re: toxic reader comments

    I had the mods delete a few "extremist" things, like, well, saying "such and such politicians should be decapitated right now"... 1) I am against capital punishment; better to have the rotten rot away in a hole that's going to rot their minds away and experience true hell, when deserved, than to give them the gift of a swift death. 2) It attracts bad people, like you said, and this is why we can't have nice things.

     


  31.  
    Larry, Feb 16th, 2014 @ 12:48pm

    Section 230

    One of the biggest benefits to Section 230 is that it tends to stifle states from passing content liability laws.

     


  32.  
    ryuugami, Feb 16th, 2014 @ 1:47pm

    Re: toxic reader comments

    "they were likely put there by an agent provocateur trying to get them in trouble."

    "Frankly, it's almost shocking that Techdirt still has a very liberal reader-comment policy after being online so many years."

    Well, the agents provocateur we get here are OOTB&co., so really, it's not shocking at all.

     


  33.  
    Anonymous Coward, Feb 16th, 2014 @ 10:05pm

    Re: Re:

    Whined the spambot.

     


  34.  
    PaulT (profile), Feb 17th, 2014 @ 1:28am

    Re: Good arguments, but...

    "Clearly as the billboard company I would be in a position to stop such speech."

    I disagree. The company can only reactively remove the "speech" from their property. They wouldn't have control over who comes to post anything there. So they can't "stop" a damn thing; they can only try to make sure that anything unauthorised is not seen for long. You can perhaps argue that they're responsible if they refuse to take it down, but they sure as hell aren't liable for the "speech" being present in the first place.

    It's a bad analogy, anyway. What billboard company allows the finite space that's its entire business to be overrun by people posting their own stuff for free? In this case, the "on the internet" part is absolutely vital, since it changes the whole concept once you're not dealing with finite, expensive real estate.

    You're basically arguing that the owner of a wall should be liable for the content of the graffiti painted on it. It's idiotic, short-sighted and will achieve nothing if enforced, since the owner can only ever react to something being posted. Unless you're arguing for people to have to screen everything ever posted, which ironically probably means I wouldn't have been able to read this comment of yours and reply to it -- at least not yet, since the volume of comments being screened by a human being would cause ridiculous delays.

     


  35.  
    PaulT (profile), Feb 17th, 2014 @ 1:35am

    Re: Re: What would happen to TV

    Because they're a broadcast medium that pre-screens a tiny amount of content, which is only ever shown sequentially, one program at a time -- not one serving thousands or millions of pages simultaneously at users' request, with content over which they have no proactive control.

    It's a horrible analogy anyway, since advertising complaint authorities exist and regularly uphold complaints from the public. So -- by your own example -- even if sites screened everything ever posted, they would probably still be held liable for the wrongdoing of others.

    I do, however, find it amusing that an anonymous coward is arguing for controls that would probably remove their right to post as such. Seems to me someone hasn't thought this through.

     


  36.  
    PaulT (profile), Feb 17th, 2014 @ 1:39am

    Re: Re: toxic reader comments

    "Techdirt intentionally holds comments for moderation when they come from people they don't like."

    No, they hold comments that come from spamming or trolling idiots whose previous comments have regularly been reported as such by other users. If you're one of the people who suffers from moderation on a regular basis, you might want to read your comments and work out why. Unless they just contain lots of links, of course, in which case they're usually moderated and most of us (including myself) have had that happen. They DO get approved if they're acceptable, however.

    In short, stop lying and you'll stop getting filtered - although I find it amusing that *this* is the level of censorship you find unacceptable. You don't get out much, do you?

     


  37.  
    Anonymous Coward, Feb 17th, 2014 @ 1:39am

    Re:

    If this site was cleaned up you would be nowhere to be found, fuckwit.

     


  38.  
    Marc John Randazza (profile), Feb 17th, 2014 @ 6:23am

    Re: Re: Good arguments, but...

    You wrote:

    "You're basically arguing that the owner of a wall should be liable for the content of the graffiti painted on it."

    Is that such a bad standard?

    I agree that it would be absurd to hold the owner of the wall strictly liable for graffiti painted on it. But let's agree that the graffiti is somehow violating a third party's rights.

    Then, I think nobody would say the wall owner is liable at the moment that the graffiti is placed there. After all, he had nothing to do with creating the graffiti, and probably didn't even want it there.

    But would you argue that he should never become liable? If he gets a letter from the subject of the graffiti, bringing it to his attention? If it is easy to remove? And yet the wall owner decides that he just doesn't care? At some point, wouldn't you want the wall owner to have some responsibility?

     


  39.  
    Gwiz (profile), Feb 17th, 2014 @ 6:59am

    Re: Re: Re: Lets not totally close our minds...

    "But, I can't say that I don't have personal feelings of sympathy for the poor bastards trying to deal with that. Did the original poster use Tor? Tough shit for you then. Is the original poster judgment-proof? Tough shit for you then."


    How does this fit into the long-standing legal tradition of treating anonymous speech as protected by the First Amendment? From your tone in what I quoted, it sounds like you may wish to erode that protection, but I'm not sure, so I'm asking.


  40.  
    Anonymous Coward, Feb 17th, 2014 @ 7:11am

    Re: Re: Re: Good arguments, but...

    " somehow violating a third party's rights."

    And this will never be in dispute: corporations will always be correct in their assertion of being violated, whereas the common folk will always be incorrect when they claim violations. Furthermore, simply stating this will become a violation of someone's rights. Off to a for-profit prison, you insulter of Kings.


  41.  
    Marc John Randazza (profile), Feb 17th, 2014 @ 7:11am

    Re: Re: Re: Re: Lets not totally close our minds...

    That's a good question: I agree that you have a First Amendment right to speak anonymously. However, if you look at the line of cases dealing with this right (Dendrite, Cahill, etc.), the right is not absolute. (Nor do I believe it should be.) Nevertheless, the cases (and I) agree that the aggrieved party must put forth some showing of liability for the speech before the speaker may be legally unmasked.

    To answer your question, I do not wish to erode the protection for anonymous speech. I think that the balance we have is pretty good -- when the courts actually follow it. I think a very good case articulating the standard is Krinsky v. Doe 6: http://www.dmlp.org/sites/citmedialaw.org/files/2008-02-06-Krinsky_v._Doe_Opinion.pdf

    The bottom line is, if the speech is truly actionable (and the aggrieved party makes that showing before the speaker's identity is revealed), then the speaker should not be able to evade liability only because the speaker manages to hide. On the other hand, I do not think that plaintiffs should be able to unmask anonymous speakers without making a showing that the speech is actionable, because First Amendment.


  42.  
    Anonymous Coward, Feb 17th, 2014 @ 7:16am

    Re: Re: What would happen to TV

    Because they once or twice made the news for refusing to air material they found to be objectionable, all of a sudden they are the purveyors of all that is right and just in this universe. Amirite?


  43.  
    Marc John Randazza (profile), Feb 17th, 2014 @ 7:18am

    Re: Re: Re: Re: Good arguments, but...

    If you look at the court cases dealing with this, I think you might find yourself to be a little more optimistic. In general, the courts have not been too quick to pull the mask off of a speaker without the right level of proof.


  44.  
    Anonymous Coward, Feb 17th, 2014 @ 7:23am

    Re: Re: Re: Re: Re: Lets not totally close our minds...

    Obviously, the First Amendment rights of corporations far outweigh those of the individual, unless that individual is rich - then it becomes interesting.


  45.  
    PaulT (profile), Feb 18th, 2014 @ 1:00am

    Re: Re: Re: Good arguments, but...

    Not responsibility for the content of the graffiti, no. The owner of a wall is as much the victim of the actions of the third party as the person supposedly affected by the content.

    If the owner of the wall is ordered by a court to remove it and fails to comply with that order, then it might be acceptable for them to be charged with violating the court order. But they should NEVER be held responsible for the graffiti itself. The writer of the graffiti should always be the only one responsible, not the innocent third party who merely happens to be the easiest to target.


  46.  
    btrussell (profile), Feb 18th, 2014 @ 4:43am

    Re: Re: Re: Good arguments, but...

    "If he gets a letter from the subject of the graffiti, bringing it to his attention?"

    If it bothered the person that much, I would probably give them permission to clean it off.

    I can't be responsible for what everyone or anyone may find offensive. Maybe the offended party needs to take a little responsibility for their feelings.


  47.  
    Marc John Randazza (profile), Feb 18th, 2014 @ 7:47am

    Re: Re: Re: Re: Good arguments, but...

    So you would say that liability should never attach to the wall owner?

    Including if the aggrieved party is willing to clean the graffiti off himself, but the wall owner says "nah, I like it there" -- or "well, I might not have written it, but so many people come by to see it, that it has actually improved my property values, so I am leaving it."

    Should the wall owner be automatically or immediately liable? I say no. However, I see no reason why the wall owner should be able to reap the benefits of the content without enjoying the responsibility for it as well. That responsibility should not attach immediately, nor should the system have "clean the wall" as its default. But, if a party has a legitimate legal grievance, is willing to bear the cost of cleaning the wall, and asks the wall owner for permission to do so, then the wall owner should either have to permit it, or make the words his own.

    The reflexive desire to protect Section 230, in its current form, at all costs, borders on irrational.

    Only Sith think in absolutes, PaulT.


  48.  
    PaulT (profile), Feb 18th, 2014 @ 8:22am

    Re: Re: Re: Re: Re: Good arguments, but...

    Did you miss the part where I said that the owner would be liable if they refused to remove the graffiti when ordered to by a relevant authority? Apparently so. If the graffiti is offensive enough to warrant a court order, and that order is ignored, then they're liable for that action (or lack thereof). Up until that point, it's down to the discretion of the property owner, since such things are extremely subjective and ripe for abuse, especially once we stop talking about physical walls and start looking at unfounded demands from corporations using flawed automated searches.

    If believing that a person who did not write a comment should never be held liable for its content makes me a Sith, then call me Darth Paul. A person should never be held directly liable for the actions of others. However, this does not mean that I excuse the owner from their own actions (e.g. refusing to allow access to remove the graffiti, refusal to comply with a valid court order, etc.).

    This is a very simple concept, not an irrational one. A third party should never be held liable for actions committed by others, only for actions they themselves committed. I've yet to hear a sane argument as to why they should be that doesn't fall into the realm of "but it's too hard to go after the people actually responsible", and that's not an acceptable excuse in my book.


  49.  
    Marc John Randazza (profile), Feb 18th, 2014 @ 8:51am

    Re: Re: Re: Re: Re: Re: Good arguments, but...

    Well, PaulT, you might not like it, but there is a pretty vast body of law saying otherwise, in the United States and every other legal system I've studied. Third-party liability, contributory liability, and vicarious liability are all theories that are alive and well. I'd be pleased to direct you to resources on that, if you're interested.

    Further, you seem to be ok with that concept -- you just draw the line at the wall (or website) owner being liable only after a court order. I'm not arguing for liability prior to a court order. I'm arguing that the website owner should get the choice - once put on notice - of either defending the content or complying with a takedown request (with appropriate safeguards and blowback for bogus complaints).


  50.  
    PaulT (profile), Feb 19th, 2014 @ 12:40am

    Re: Re: Re: Re: Re: Re: Re: Good arguments, but...

    "I'm arguing that the website owner should get the choice - once put on notice - of either defending the content or complying with a takedown request (with appropriate safeguards and blowback for bogus complaints)"

    ...and that's exactly the point I'm trying to get at. If the content is found by the appropriate authority to require a takedown for whatever reason, it's perfectly acceptable for the site owner to be charged with failing to comply with that legal request. Safeguards to prevent the abuse we've often seen with takedown requests are also a great idea. But, nowhere do I think that the owner should be liable for the content of the message they refuse to take down, only the act of refusing the court order to do so.

    Either we're agreeing and you're arguing semantics, or you're not understanding my basic point.


  51.  
    97pstore, Feb 19th, 2014 @ 12:55am

    I had found a valuable article here and thanks for giving this nice content and also i request you to visit mine also: Online party suppliers


  52.  
    VC lawyer, Feb 25th, 2014 @ 11:07am

    Abused

    What do you think about the first amendment being abused for commercial gain?

    Should they be sued and exposed?


