US Reporter Ronan Farrow Calls On Internet Companies To Censor Speech Of People He Doesn't Like
from the fascinating dept
The Sunni Islamic State insurgents, now locked in a deadly struggle with Iraq’s Shiite majority, excel online. They command a plethora of official and unofficial channels on Facebook, Twitter, and YouTube. “And kill them wherever you find them,” commands one recent propaganda reel of firefights and bound hostages, contorting a passage from the Koran. “Take up arms, take up arms, O soldiers of the Islamic State. And fight, fight!” adds another, featuring a sermon from the group’s leader, Abu Bakr al-Baghdadi. The material is often slickly produced, like “The Clanging of Swords IV,” a glossy, feature-length film replete with slow-motion action scenes. Much of it is available in English, directly targeting the recruits with Western passports that have become one of the organization’s more dangerous assets. And almost all of it appeals to the young: Photoshops of Islamic State fighters and their grisly massacres with video game-savvy captions like, “This is our Call of Duty.”

Of course, what Farrow ignores is that it's not at all difficult to find Americans using social media for similar calls to action. For example, how about a Fox News contributor announcing that "Muslims are evil. Let's kill them all." Or a Breitbart News contributor calling for people to "start slaughtering Muslims in the streets, all of them."
But officials at social media companies are leery of adjudicating what should be taken down and what should be left alone. “One person’s terrorist is another person’s freedom fighter,” one senior executive tells me on condition of anonymity. Making that call is “not something we’d want to do.”
I find both of those statements abhorrent, but the point is that idiots will make stupid, incendiary statements on Twitter, Facebook and YouTube all the time -- and most people look at them and recognize that they're ignorant crazy people talking. No one actually rushes out to act on those arguments. Yet Farrow seems to think that the people who follow those other groups on social media immediately accept what is said and follow through.
Just because people are saying stupid stuff on social media doesn't mean internet companies should step in and decide what is and what is not appropriate. Where do you draw the line? Farrow breezily admits that it may be difficult to figure out what to take down and what to leave up, but... then just assumes it's kind of easy anyway... because child porn.
More troubling still is the fact that these companies already know how to police and remove content that violates other laws. Every major social media network employs algorithms that automatically detect and prevent the posting of child pornography. Many, including YouTube, use a similar technique to prevent copyrighted material from hitting the web. Why not, in those overt cases of beheading videos and calls for blood, employ a similar system?

See how limited types of censorship almost always lead to calls for greater and greater censorship? It's fairly amazing that an attorney, former State Department official and a reporter would so blatantly call for censorship, but that appears to be Farrow's bag. Besides, he's apparently rather clueless about why his call for censoring "terrorists" is so different from child porn (an absolute liability situation, where it's generally immediately obvious if something is illegal) and copyright (where the system is already quite problematic, and involves a detailed notice-and-takedown process that has massive, dangerous unintended consequences). He also ignores the fact that all of these companies already do pull down extremist content (something many folks think already goes too far). Apparently, Farrow's not big on details.
Farrow does mention Section 230 of the CDA, but apparently is ignorant of how that law actually works as well:
As always, beneath legitimate practical and ethical concerns, there is a question about the bottom line. Section 230 of the Telecom Act of 1996 inoculates these companies from responsibility for content that users post—as long as they don’t know about it. Individuals involved in content removal policies at the major social media companies, speaking to me on condition of anonymity, say that’s a driving factor in their thinking. “We can’t police any content ourselves,” one explains. Adds another: “The second we get into reviewing any content ourselves, record labels say, ‘You should be reviewing all videos for copyright violations, too.’”

First of all, the "as long as they don't know about it" bit is flat out wrong. Section 230 is explicit that even if a company does know about content, it's entirely at the company's discretion whether or not to remove it -- and removing some content imposes no additional obligation to remove other content. However, the final comment is more accurate -- though, amusingly, it contradicts Farrow's own earlier claim that these companies already know how to stop copyright-covered content from appearing.
The point is that determining who is and who is not a "terrorist" isn't so easy, and that slope is very slippery. Should those Fox News and Breitbart contributors be cut off as well for their "terroristic" threats? Remember that after then-Senator Joe Lieberman went on a similar crusade to get YouTube to take down "terrorist" videos, it resulted in YouTube disabling the YouTube channel of an important Syrian watchdog group that had been unveiling atrocities in that country.
Farrow keeps going back to the genocide in Rwanda to make his point. But under his logic, anyone documenting that genocide and getting the news out to the world would likely be censored, allowing that kind of genocide to continue out of sight.
Yes, if you think simplistically about things, it must seem so easy to just say, "Well, censor the bad guys." But you'd think that someone with Farrow's training and background would actually know that simplistic solutions to challenging and nuanced questions often result in very dangerous policies with serious unintended consequences.