NY Times Columnist Nick Kristof Led The Charge To Get Facebook To Censor Content, Now Whining That Facebook Censors His Content

from the karma-nick,-karma dept

We've talked in the past about NY Times columnist Nick Kristof, who is somewhat infamous for the savior complex that runs through his views. He is especially big on moral panics around sex trafficking, and was one of the most vocal proponents of FOSTA, despite not understanding what the law would do at all (spoiler alert: just as we predicted, and as Kristof insisted would not happen -- FOSTA has put more women at risk). When pushing for FOSTA, Kristof wrote the following:

Even if Google were right that ending the immunity for Backpage might lead to an occasional frivolous lawsuit, life requires some balancing.

For example, websites must try to remove copyrighted material if it’s posted on their sites. That’s a constraint on internet freedom that makes sense, and it hasn’t proved a slippery slope. If we’re willing to protect copyrights, shouldn’t we do as much to protect children sold for sex?

As we noted at the time, this was an astoundingly ignorant thing to say, but of course now that Kristof helped get the law passed and put many more lives at risk, the "meh, no big deal if there are some more lawsuits or more censorship" attitude seems to be coming back to bite him.

You see, last week, Kristof weighed in on US policy in Yemen. The core of his argument was the horrific situation of Abrar Ibrahim, a 12-year-old girl who is starving in Yemen and weighs just 28 pounds. Atop the article is a giant photo of the emaciated Ibrahim, wearing just a diaper. It packs an emotional punch, just as intended.

But, it turns out that Facebook is blocking that photo of Ibrahim, claiming it is "nudity and sexual content." And, boy, is Kristof mad about it:

Hey, Nick, you were the one who insisted that Facebook and others in Silicon Valley needed to ban "sexual content" or face criminal liability. You were the one who insisted that any collateral damage would be minor. You were the one who said there was no slippery slope.

Yet, here is a perfect example of why clueless saviors like Kristof always make things worse, freaking out about something they don't understand and prescribing the exact wrong solution. Moderating billions of pieces of content leads to lots of mistakes. The only way you can do it is to set rules. Thanks to laws like FOSTA -- again, passed at Kristof's direct urging -- Facebook has rules about nudity that include a blanket ban on female nudity/nipples. This rule made a lot of news two years ago when Facebook banned an iconic photo from the Vietnam War showing a young, naked girl fleeing a napalm attack. Facebook eventually created a "newsworthy" exception to the rule, but that depends on the thousands of "content moderators" viewing this content knowing that this particular photo is newsworthy.

And, thanks to FOSTA, the cost of making a mistake is ridiculously high (possible criminal penalties), and thus, the only sane thing for a company like Facebook to do is to take that content down and block it. That's exactly what Nick Kristof wanted. But now he's whining because the collateral damage he shrugged off a year ago is himself. Yeah, maybe next time Nick should think about that before shrugging off what every single internet expert tried to explain to him at the time.

But hey, Nick, as someone once said, maybe the law you pushed for leads to an occasional frivolous takedown of important content about the impact of US policy on an entire population, but "life requires some balancing." Oh well.


Reader Comments



  • Anonymous Coward, 17 Dec 2018 @ 12:17pm

    Oh, man. Love this Kristof quote:

    For example, websites must try to remove copyrighted material if it’s posted on their sites. That’s a constraint on internet freedom that makes sense, and it hasn’t proved a slippery slope. If we’re willing to protect copyrights, shouldn’t we do as much to protect children sold for sex?

    In other words, it hasn't proved a slippery slope. But might I suggest one?


    • Anonymous Coward, 17 Dec 2018 @ 12:42pm

      Re:

      websites must try to remove copyrighted material

      I see a copyright notice on his opinion columns, and no obvious waiver of copyright on his Twitter messages...


  • Gary (profile), 17 Dec 2018 @ 12:30pm

    Easy fix

    Hey - Filtering is such a simple fix! Unless you have to do it.


  • Bruce C., 17 Dec 2018 @ 12:35pm

    Exploitation...

    If Kristof wants to avoid running afoul of legally mandated sexual content filters, he should stop trafficking in partially nude photos of children to sell newspapers. In fact, maybe Facebook should report him, the Times and the photographer as potential child sex-traffickers. If you suspect child porn, you're required to report it, right? And since the filter flagged it, it clearly must be porn.

    We have met the enemy and he is us.


    • John Warr, 25 Dec 2018 @ 10:59am

      Re: Exploitation...

      Except that this isn't a case of "legally mandated sexual content filters." Facebook has been doing this since well before FOSTA: they delete breastfeeding images, the photo of the Vietnamese girl running away from the village, etc., etc. I believe they've deleted images of Michelangelo's David in the past too.

      It has nothing to do with FOSTA and everything to do with pandering to FB flash mobs.


  • Anonymous Coward, 17 Dec 2018 @ 12:51pm

    Gotta love that:

    https://twitter.com/NickKristof/status/1074753415241183234

    Of course, only the "real" journalists should be excluded from filters... not all those little guys.


  • That One Guy (profile), 17 Dec 2018 @ 1:26pm

    Oh sweet schadenfreude...

    "Sacrifices must be made for the greater good!"

    Several months later

    "I didn't mean I should be the one making them!"


  • Anonymous Coward, 17 Dec 2018 @ 1:34pm

    It would indeed be funny if it ever turned out that Nick Kristof is a sexual predator. Not that this would be anything unusual, as notable women's-rights "progressives" like Bill Cosby and Harvey Weinstein turned out to be doing naughty things behind closed doors that defied their benevolent public image. Even vociferous crusaders against sexual evils, such as a certain "wide stance" senator and numerous religious leaders, have turned out to be complete hypocrites.

    While porn used to be considered the worst type of content, and historically was always first on the list to get banned, that might no longer be the case, at least according to the way internet activists work to censor content they don't like. A small crowdfunding site like Subscribestar that caters to "adult" content can remain online for a year without anyone complaining, and then suddenly gets pink-slipped by all its payment processors and is forced to close down within days of a non-porn gamergate blogger (and numerous allies in solidarity) joining the site, due to nothing more than false accusations.

    https://archive.fo/ED1Qh

    As the continued attacks against 'free speech' rage on, FaceBook is far from the worst culprit to give in to the demands of the intolerant pro-censorship mob who increasingly engage in scorched earth tactics.


  • Hugo S Cunningham (profile), 17 Dec 2018 @ 4:40pm

    A solution Kristof would like...

    [sarc] Large corporations like the "New York Times" (licensed press outlets?) could be trusted with the responsibility to monitor themselves, and exempted from draconian civil and criminal penalties unless deliberate criminal intent is shown (highly unlikely). For the occasional mistake that lets something bad through, they could pay a reasonable fine, reflecting the fact that they are responsible and licensed, unlike "cowboy" bloggers and small independent publishers.[end sarc]


    • ryuugami, 17 Dec 2018 @ 7:20pm

      Re: A solution Kristof would like...

      Um. You put that in "sarc" marks, but it's exactly what Kristof suggested in the tweet that the AC linked to upthread:

      Thanks for fixing. For the future, can I suggest a presumption by Facebook that articles with a nytimes or washpost or npr domain are legitimate content and are not pornographic?

      Well, except for the "pay a reasonable fine if a mistake happens" provision. That'd be going a step too far.


      • bob, 18 Dec 2018 @ 12:27am

        Re: Re: A solution Kristof would like...

        Except why should we assume they will never put pornographic or otherwise "unsavory" things in their articles? They are news organizations, not gods. They will make mistakes. And when they do, who will pick up the fines for posting illegal content?

        Seems like Facebook should just continue to block everything by default for those companies just like everyone else.


        • PaulT (profile), 18 Dec 2018 @ 1:33am

          Re: Re: Re: A solution Kristof would like...

          Hell, even if they don't make mistakes, hacks and defacing are a thing. If Facebook allows linked or embedded content, and NYT's servers get hacked or their DNS is spoofed, they end up pointing to porn anyway. Chances are people who don't understand the internet will still try to hold them responsible.

          That's the entire point of safe harbours and other protections - if Facebook are allowing user generated content, they have no direct control and should not be treated as directly culpable. He demanded that they be held liable anyway, so he gets to deal with the result of that. Hopefully he bears this in mind next time someone informs him about unintended consequences.


  • Anonymous Coward, 17 Dec 2018 @ 6:38pm

    NYT, the hidden government's mouthpiece. One day our two governments will clash, then burn baby burn.


  • That Anonymous Coward (profile), 17 Dec 2018 @ 11:08pm

    Enjoy your bed, we added the 10,000 bedbugs you demanded everyone else have to deal with... oh, that's bad now?
    Oh well.


  • fsa9, 18 Dec 2018 @ 4:03am

    yea, fuck nick


  • John Cressman, 18 Dec 2018 @ 9:08am

    When they came for...

    Time to update the old saying...

    When Facebook came for the conservatives, I didn't say anything because I wasn't a conservative...

    When Facebook came for the Non-Politically Correct crowd, I didn't say anything because I was politically correct...

    When Facebook came for me, there was no one left to say anything...


    • PaulT (profile), 18 Dec 2018 @ 9:30am

      Re: When they came for...

      I'm not sure what's worse, utilising a poem about the Holocaust to complain about some people not being able to use private property without consequences, or the fact that among the people first targeted by Facebook's policies were actual Nazis and you don't see the irony.


      • Valkor, 18 Dec 2018 @ 10:50am

        Re: Re: When they came for...

        Ok, fine. First, Facebook came for the Nazis, but I wasn't a Nazi.

        Irony can be thought provoking. I think of it like a stinky cheese that tastes good. They both can evoke a negative initial reaction, but improve upon careful consideration.

        Now, clearly, the Facebook version is not nearly as serious as the actions of an actual government. We're nowhere near a Fourth Reich. If Facebook gets too oppressive, they will merely destroy their own business eventually. But... What happens if that censorship breaks containment? What happens if we get conditioned to expect our gatekeepers to protect us from whatever it is we don't like? There are already plenty of people who think the government should do just that, regardless of Facebook's policies. When Tech fails, do we want people turning to the government for all their censorship needs?

        Facebook wanted to be all things to all people. Now, it looks like it wants to be the communication platform for everyone, but only the communication it wants. Well, guess what. When your "community" is everybody, you don't have "community standards" anymore. You can be niche, and expect to cultivate a community that at least agrees on ground rules, or you can be ubiquitous and take all the bad with the good. Facebook has no business moderating beyond things that are actually illegal, and that would be enough to keep a healthy debate going. If Facebook wants to be the middleman for everyone's news and information, they don't have any business editing that information.

        Facebook, I hope you bleed to death from your own self harm.


        • Anomalous Cowherder, 18 Dec 2018 @ 11:28am

          Re: Re: Re: When they came for...

          Ok, fine. First, Facebook came for the Nazis, but I wasn't a Nazi.

          You are now. Nazi.

          Don't sealion about this. That proves you are a troll.


        • PaulT (profile), 19 Dec 2018 @ 12:49am

          Re: Re: Re: When they came for...

          "Ok, fine. First, Facebook came for the Nazis, but I wasn't a Nazi."

          So, you are defending the right of Nazis to pursue their aims of genocide?

          "Facebook has no business moderating beyond things that are actually illegal"

          They can do whatever the hell they want, actually, unless they violate some law themselves by doing so. For example, they can tell white supremacists and literal Nazis to fuck off their property, but they can't say the same to black people or Jews just because they don't like their race.


          • Valkor, 19 Dec 2018 @ 1:16pm

            Re: Re: Re: Re: When they came for...

            So, you are defending the right of Nazis to pursue their aims of genocide?

            Not remotely. That sounds like a criminal conspiracy. I do, however, support their right to talk about it and let them display their ignorance, their lunacy, and their general assholery.

            They can do whatever the hell they want, actually, unless they violate some law themselves by doing so.

            Of course they can. I was trying to say that, because Facebook is so damn big, it would be wise of Facebook to tolerate bad ideas on their platform so that others can more effectively use their platform to counter with good ideas. Maybe I'm idealistic in thinking that good information drives out bad information.

            Mason Wheeler had an interesting comment that showed up in the Techdirt Insider Chat sidebar about how censorship of Hitler actually gave him credibility to a certain group. I'm all for denying credibility to his emotionally damaged acolytes.


            • PaulT (profile), 20 Dec 2018 @ 12:49am

              Re: Re: Re: Re: Re: When they came for...

              "I do, however, support their right to talk about it and let them display their ignorance, their lunacy, and their general assholery."

              As do I. However, I also support the right of Facebook to moderate their own platform so that the majority of right-minded people who use their platform don't have to read that shit. If a drunk asshole is shouting and trying to start fights in a bar, the bar owner is not in the wrong to kick him out into the street. Let him find another bar, or set up his own, if he really wants do that stuff.

              "Maybe I'm idealistic in thinking that good information drives out bad information"

              Yeah, that worked perfectly in 1930s Germany, didn't it? Heather Heyer was murdered at a time when their crap *was* being tolerated on those platforms. No, drive them out into whichever cesspool they wish to retreat to and keep a close eye on them. Their actions have consequences, and one of those consequences is people telling them to get the fuck out of their house.


      • Anonymous Coward, 18 Dec 2018 @ 12:04pm

        Re: Re: When they came for...

        We should keep in mind that Martin Niemöller, the author of "First They Came For ..." was a communist-hating Hitler supporter -- so basically, yes, a "Nazi" -- who had no problem with the Nazi roundup of communists, and didn't complain much as they slowly worked their way up the ladder of "undesirables" ... until he himself was eventually imprisoned.

        But apparently the people who today eagerly support "de-platforming" of people they don't like do so with the naive certainty that the chopping block will never come to them. History of course demonstrates otherwise.


        • PaulT (profile), 19 Dec 2018 @ 12:52am

          Re: Re: Re: When they came for...

          "But apparently the people who today eagerly support "de-platforming" of people they don't like do so with the naive certainty that the chopping block will never come to them"

          So... you're also so moronic that you think that a private company refusing service is the same as rounding up and murdering people? Wow.

          This is why the idiotic analogy fails. If I got "de-platformed" by Facebook for my beliefs, I'd just go somewhere else -- or easily set up my own platform if none of the existing ones want me for commercial reasons. That's somewhat different from being kidnapped from my own home and murdered. I hope you're just being hyperbolic and not actually thinking the things are even remotely similar.


  • Hugo S Cunningham (profile), 18 Dec 2018 @ 10:18am

    Hey, Troll--

    The original post and comments were not about copyright claims ("use private property without consequences"). They were about FOSTA -- shutting down anything potentially related to sex imagery or sex work.


