NY Times Columnist Nick Kristof Led The Charge To Get Facebook To Censor Content, Now Whining That Facebook Censors His Content

from the karma-nick,-karma dept

We’ve talked in the past about NY Times columnist Nick Kristof, who is a bit infamous for having something of a savior complex in his views. He is especially big on moral panics around sex trafficking, and was one of the most vocal proponents of FOSTA, despite not understanding what the law would do at all (spoiler alert: just as we predicted, and as Kristof insisted would not happen — FOSTA has put more women at risk). When pushing for FOSTA, Kristof wrote the following:

Even if Google were right that ending the immunity for Backpage might lead to an occasional frivolous lawsuit, life requires some balancing.

For example, websites must try to remove copyrighted material if it’s posted on their sites. That’s a constraint on internet freedom that makes sense, and it hasn’t proved a slippery slope. If we’re willing to protect copyrights, shouldn’t we do as much to protect children sold for sex?

As we noted at the time, this was an astoundingly ignorant thing to say, but of course now that Kristof helped get the law passed and put many more lives at risk, the “meh, no big deal if there are some more lawsuits or more censorship” attitude seems to be coming back to bite him.

You see, last week, Kristof weighed in on US policy in Yemen. The core of his argument was to discuss the horrific situation of Abrar Ibrahim, a 12-year-old girl who is starving in Yemen, and weighs just 28 pounds. There’s a giant photo of the emaciated Ibrahim atop the article, wearing just a diaper. It packs an emotional punch, just as intended.

But, it turns out that Facebook is blocking that photo of Ibrahim, claiming it is “nudity and sexual content.” And, boy, is Kristof mad about it:

Hey, Nick, you were the one who insisted that Facebook and others in Silicon Valley needed to ban “sexual content” or face criminal liability. You were the one who insisted that any collateral damage would be minor. You were the one who said there was no slippery slope.

Yet, here is a perfect example of why clueless saviors like Kristof always make things worse, freaking out about something they don’t understand and prescribing the exact wrong solution. Moderating billions of pieces of content leads to lots of mistakes. The only way you can do it is to set rules. Thanks to laws like FOSTA — again, passed at Kristof’s direct urging — Facebook has rules about nudity that include a blanket ban on female nudity/nipples. This rule made a lot of news two years ago when Facebook banned an iconic photo from the Vietnam War showing a young, naked girl fleeing a napalm attack. Facebook eventually created a “newsworthy” exception to the rule, but that depends on the thousands of “content moderators” viewing this content knowing that this particular photo is newsworthy.

And, thanks to FOSTA, the cost of making a mistake is ridiculously high (possible criminal penalties), and thus, the only sane thing for a company like Facebook to do is to take that content down and block it. That’s exactly what Nick Kristof wanted. But now he’s whining because the collateral damage he shrugged off a year ago is himself. Yeah, maybe next time Nick should think about that before shrugging off what every single internet expert tried to explain to him at the time.

But hey, Nick, as someone once said, maybe the law you pushed for leads to an occasional frivolous takedown of important content about the impact of US policy on an entire population, but “life requires some balancing.” Oh well.


Comments on “NY Times Columnist Nick Kristof Led The Charge To Get Facebook To Censor Content, Now Whining That Facebook Censors His Content”

Anonymous Coward says:

Oh, man. Love this Kristof quote:

For example, websites must try to remove copyrighted material if it’s posted on their sites. That’s a constraint on internet freedom that makes sense, and it hasn’t proved a slippery slope. If we’re willing to protect copyrights, shouldn’t we do as much to protect children sold for sex?

In other words, it hasn’t proved a slippery slope. But might I suggest one?

Bruce C. says:

Exploitation...

If Kristof wants to avoid running afoul of legally mandated sexual content filters, he should stop trafficking in partially nude photos of children to sell newspapers. In fact, maybe Facebook should report him, the Times and the photographer as potential child sex-traffickers. If you suspect child porn, you’re required to report it, right? And since the filter flagged it, it clearly must be porn.

We have met the enemy and he is us.

John Warr says:

Re: Exploitation...

Except that this isn’t a case of “legally mandated sexual content filters.” Facebook has been doing this since well before FOSTA: they delete breastfeeding images, the photo of the Vietnamese girl running away from her village, etc. I believe they’ve deleted images of Michelangelo’s David in the past too.

It has nothing to do with FOSTA and everything to do with pandering to FB flash mobs.

Anonymous Coward says:

It would indeed be funny if it ever turned out that Nick Kristof is a sexual predator. Not that this would be something unusual, as notable womens-rights “progressives” like Bill Cosby and Harvey Weinstein turned out to be doing naughty things behind closed doors that defied their benevolent public image. Even vociferous crusaders against sexual evils, such as a certain “wide stance” senator and numerous religious leaders, have turned out to be complete hypocrites.

While porn used to be considered the worst type of content, and historically was always first on the list to get banned, that might no longer be the case, at least according to the way internet activists work to censor content they don’t like. A small crowdfunding site like Subscribestar that caters to “adult” content can remain online for a year without anyone complaining, and then suddenly gets pink-slipped by all its payment processors and is forced to close down within days of a non-porn gamergate blogger (and numerous allies in solidarity) joining the site, due to nothing more than false accusations.

https://archive.fo/ED1Qh

As the continued attacks against ‘free speech’ rage on, Facebook is far from the worst culprit to give in to the demands of the intolerant pro-censorship mob, who increasingly engage in scorched-earth tactics.

Hugo S Cunningham (profile) says:

A solution Kristof would like...

[sarc] Large corporations like the “New York Times” (licensed press outlets?) could be trusted with the responsibility to monitor themselves, and exempted from draconian civil and criminal penalties unless deliberate criminal intent is shown (highly unlikely). For the occasional mistake that lets something bad through, they could pay a reasonable fine, reflecting the fact that they are responsible and licensed, unlike “cowboy” bloggers and small independent publishers.[end sarc]

ryuugami says:

Re: A solution Kristof would like...

Um. You put that in "sarc" marks, but it’s exactly what Kristof suggested in the tweet that the AC linked to upthread:

Thanks for fixing. For the future, can I suggest a presumption by Facebook that articles with a nytimes or washpost or npr domain are legitimate content and are not pornographic?

Well, except for the "pay a reasonable fine if a mistake happens" provision. That’d be going a step too far.

bob says:

Re: Re: A solution Kristof would like...

Except why should we assume they will never put pornographic or otherwise “unsavory” things in their articles? They are news organizations, not gods. They will make mistakes. And when they do, who will pick up the fines for posting illegal content?

Seems like Facebook should just continue to block everything by default for those companies just like everyone else.

PaulT (profile) says:

Re: Re: Re: A solution Kristof would like...

Hell, even if they don’t make mistakes, hacks and defacements are a thing. If Facebook allows linked or embedded content, and then NYT’s servers get hacked or their DNS is spoofed, they end up pointing to porn anyway. Chances are people who don’t understand the internet will still try to hold them responsible.

That’s the entire point of safe harbours and other protections – if Facebook are allowing user generated content, they have no direct control and should not be treated as directly culpable. He demanded that they be held liable anyway, so he gets to deal with the result of that. Hopefully he bears this in mind next time someone informs him about unintended consequences.

John Cressman says:

When they came for...

Time to update the old saying…

When Facebook came for the conservatives, I didn’t say anything because I wasn’t a conservative…

When Facebook came for the Non-Politically Correct crowd, I didn’t say anything because I was politically correct…

When Facebook came for me, there was no one left to say anything…

Valkor says:

Re: Re: When they came for...

Ok, fine. First, Facebook came for the Nazis, but I wasn’t a Nazi.

Irony can be thought provoking. I think of it like a stinky cheese that tastes good. They both can evoke a negative initial reaction, but improve upon careful consideration.

Now, clearly, the Facebook version is not nearly as serious as the actions of an actual government. We’re nowhere near a Fourth Reich. If Facebook gets too oppressive, they will merely destroy their own business eventually. But… What happens if that censorship breaks containment? What happens if we get conditioned to expect our gatekeepers to protect us from whatever it is we don’t like? There are already plenty of people who think the government should do just that, regardless of Facebook’s policies. When Tech fails, do we want people turning to the government for all their censorship needs?

Facebook wanted to be all things to all people. Now, it looks like it wants to be the communication platform for everyone, but only the communication it wants. Well, guess what. When your “community” is everybody, you don’t have “community standards” anymore. You can be niche, and expect to cultivate a community that at least agrees on ground rules, or you can be ubiquitous and take all the bad with the good. Facebook has no business moderating beyond things that are actually illegal, and that would be enough to keep a healthy debate going. If Facebook wants to be the middleman for everyone’s news and information, they don’t have any business editing that information.

Facebook, I hope you bleed to death from your own self harm.

PaulT (profile) says:

Re: Re: Re: When they came for...

“Ok, fine. First, Facebook came for the Nazis, but I wasn’t a Nazi.”

So, you are defending the right of Nazis to pursue their aims of genocide?

“Facebook has no business moderating beyond things that are actually illegal”

They can do whatever the hell they want, actually, unless they violate some law themselves by doing so. For example, they can tell white supremacists and literal Nazis to fuck off their property, but they can’t say the same to black people or Jews just because they don’t like their race.

Valkor says:

Re: Re: Re:2 When they came for...

So, you are defending the right of Nazis to pursue their aims of genocide?

Not remotely. That sounds like a criminal conspiracy. I do, however, support their right to talk about it and let them display their ignorance, their lunacy, and their general assholery.

They can do whatever the hell they want, actually, unless they violate some law themselves by doing so.

Of course they can. I was trying to say that, because Facebook is so damn big, it would be wise of Facebook to tolerate bad ideas on their platform so that others can more effectively use their platform to counter with good ideas. Maybe I’m idealistic in thinking that good information drives out bad information.

Mason Wheeler had an interesting comment that showed up in the Techdirt Insider Chat sidebar about how censorship of Hitler actually gave him credibility to a certain group. I’m all for denying credibility to his emotionally damaged acolytes.

PaulT (profile) says:

Re: Re: Re:3 When they came for...

“I do, however, support their right to talk about it and let them display their ignorance, their lunacy, and their general assholery.”

As do I. However, I also support the right of Facebook to moderate their own platform so that the majority of right-minded people who use their platform don’t have to read that shit. If a drunk asshole is shouting and trying to start fights in a bar, the bar owner is not in the wrong to kick him out into the street. Let him find another bar, or set up his own, if he really wants to do that stuff.

“Maybe I’m idealistic in thinking that good information drives out bad information”

Yeah, that worked perfectly in 1930s Germany, didn’t it? Heather Heyer was murdered at a time when their crap was being tolerated on those platforms. No, drive them out into whichever cesspool they wish to retreat to and keep a close eye on them. Their actions have consequences, and one of those consequences is people telling them to get the fuck out of their house.

Anonymous Coward says:

Re: Re: When they came for...

We should keep in mind that Martin Niemöller, the author of “First They Came For …” was a communist-hating Hitler supporter — so basically, yes, a “Nazi” — who had no problem with the Nazi roundup of communists, and didn’t complain much as they slowly worked their way up the ladder of “undesirables” … until he himself was eventually imprisoned.

But apparently the people who today eagerly support “de-platforming” of people they don’t like do so with the naive certainty that the chopping block will never come to them. History of course demonstrates otherwise.

PaulT (profile) says:

Re: Re: Re: When they came for...

“But apparently the people who today eagerly support “de-platforming” of people they don’t like do so with the naive certainty that the chopping block will never come to them”

So… you’re also so moronic that you think that a private company refusing service is the same as rounding up and murdering people? Wow.

This is why the idiotic analogy fails. If I got “de platformed” by Facebook for my beliefs, I’d just go somewhere else – or easily set up my own platform if none of the existing ones want me for commercial reasons. That’s somewhat different from being kidnapped from my own home and murdered. I hope you’re just being hyperbolic and not actually thinking the things are even remotely similar.
