Facebook Declares BBC Article About French Political Polls 'Unsafe'

from the dave,-i-can't-let-you-post-that... dept

Lots of people have reasonable concerns about platforms like Facebook, which not only provide an avenue for free expression -- but also have the power to suddenly decide they won't allow certain forms of expression. Admittedly, there's always a line to be drawn somewhere. People are happy that Facebook tries to keep out spam and scams, but it's still worrying when it seems to want to filter out perfectly legitimate news stories. On Sunday, Nadim Kobeissi tweeted that Facebook wouldn't allow the sharing of a BBC article on the latest political polling in France.
I wasn't sure I believed it, so I tried to post that link to my own Facebook page and got a similar message:
Now it's possible that there's a concern over rogue dangerous ads on the BBC site -- though for many people the BBC displays no ads at all. It's also possible that Facebook's algorithms interpret news about the National Front party (which is politely described as "far right," but might more accurately be described as nationalist-to-racist) as somehow dangerous. But, just the fact that Facebook is magically determining that a news story is somehow "unsafe" without giving me any details to understand why or how is tremendously concerning.

And, again, this comes just after we've seen American politicians calling for Facebook and others to magically determine how to block "bad" content that might inspire terrorists. And, it comes just as Google's Eric Schmidt argued that these kinds of filters should be more common. Yet, examples like this show just how problematic the idea of these kinds of filters can be.

The more pressure put on companies like Facebook to do that kind of proactive filtering, the more likely that perfectly legitimate information and news stories like the BBC story here get blocked. And that should be seen as immensely problematic if you believe in free expression and the ability to share ideas freely.

Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: content, filters, free speech
Companies: bbc, facebook

Reader Comments

    PaulT (profile), 8 Dec 2015 @ 11:35pm

    Re: facebook: rigorous enforcement sometimes, blind eye other times

    "how this label is determined is not only biased, but inconsistent and mysterious"

    Inconsistent, yes, but not really mysterious in my experience. They seem to take action only when they receive a number of complaints so if, say, high profile right-wing hate groups get shut down more than PETA-style groups, that may simply be because people react differently to those kinds of issues (reports vs. ignore).

    "Facebook has also been deleting the accounts of people and organizations that Hollywood objects to. One recent example is the bittorrent site RarBG, which is not only DMCA compliant, but has never posted any links to content on its Facebook page."

    This one?


    or this one?


    If it's an older one, do you have any links to why it may have been removed? If Facebook are complying with DMCA notices themselves, it's not strictly their fault, even if the site would prefer they fight on its side to protect it from such notices. I'd hope a DMCA-compliant group would at least understand what a pain in the arse that would be.

    As for the cheerleader, I'm unfamiliar with the case. But, from your description, it sounds like the problem was that they failed to proactively disable the threat page, not that they were wrong to disable the account with offensive content under their T&Cs.

    I see your points, but given that some people on Facebook seem to have no problem sharing obviously fake "this photo was banned, share it with everyone!" type posts, I take this kind of criticism with a pinch of salt.
