Real Talk About Fake News

from the a-rare-thing dept

At this point, the category of "fake news" has become all but meaningless, a trajectory many of us saw coming the moment we first heard the words or saw the hashtag. That doesn't mean the underlying problems aren't real: many people who talk about "fake news" are trying to express genuine concern about troubling trends, but the nebulous label isn't doing them any favors, and is in fact diverting attention from the heart of the issue. Thousands of words a day are being expended on the subject with little to no visible progress in understanding it, and companies like Facebook are unveiling fact-checking features that may prove to be interesting experiments but are unlikely to make much difference in the long run. Against that backdrop, it's rare and refreshing to see someone actually get things right. That's why, if you're interested in the "fake news" phenomenon, you should read Danah Boyd's new post about the real problems that we can't expect internet platforms to magically address:

I don’t want to let companies off the hook, because they do have a responsibility in this ecosystem. But they’re not going to produce the silver bullet that they’re being asked to produce. And I think that most critics of these companies are really naive if they think that this is an easy problem for them to fix.

Too many people seem to think that you can build a robust program to cleanly define who and what is problematic, implement it, and then presto — problem solved. Yet anyone who has combatted hate and intolerance knows that Band-Aid solutions don’t work. They may make things invisible for a while, but hate will continue to breed unless you address the issues at the source. We need everyone — including companies — to be focused on grappling with the underlying dynamics that are mirrored and magnified by technology.

There’s been a lot of smart writing, both by scholars and journalists, about the intersection of intolerance and fear, inequality, instability, et cetera. The short version of it all is that we have a cultural problem, one that is shaped by disconnects in values, relationships, and social fabric. Our media, our tools, and our politics are being leveraged to help breed polarization by countless actors who can leverage these systems for personal, economic, and ideological gain. Sometimes, it’s for the lulz. Sometimes, the goals are much more disturbing.

That's just one small portion of a piece that is well worth reading in full. Boyd brings some highly relevant experience to the discussion: in the early days of Blogger, she handled content moderation and customer complaints for the platform, confronting issues like online harassment and content policy at a time when they were just emerging and the blogging world was still taking shape. She knows firsthand that it's essentially impossible to draft and enforce a consistent content policy that can't be abused and isn't itself abusive, and it's worrying but not surprising to hear her say that even the experts working on these issues inside social media companies can't stay consistent when describing the problem they want to fix.

Of course, Boyd doesn't claim to have a silver-bullet solution of her own, but her proposed approach of designing platforms and mechanisms that encourage the bridging of ideological gaps and world views is certainly a much smarter and more useful way of thinking about the problem: it calls for creative innovation to encourage better speech, rather than a never-ending battle to suppress "bad" speech, even if it's not yet clear how to put it into practice. In any case, we need much more discussion like this in place of people crying "fake news" and assuming everyone else is on board with their own personal, arbitrary definition of those words.

Filed Under: censorship, fake news, filtering, free speech


Reader Comments




    Elain, 31 Mar 2017 @ 10:08am

    Truthy seems to be trying pretty hard to come up with an online tool that can separate truth from trash. I tried out their Hoaxy tool recently and it still needs a bit of work, but the idea is that you type in the keywords from any suspicious 'news' piece you see doing the rounds, and it spits out a visual graph showing which accounts and websites are associated with it. The only downside is that it requires you to have a basic awareness of what dodgy social media users look like and how they behave, and which websites are known for fudging facts (Breitbart, for example).

    So like I said, it could use some work, but they do seem to be forging ahead in the right direction. For anyone who is interested, their site is located at: http://truthy.indiana.edu/tools/

    Also, EUvsDisinfo is a site that tracks unproven or disproven 'news' pieces appearing in the European media, with a specific focus on disinformation coming out of Russia: https://euvsdisinfo.eu/

    I'd be interested to see if anyone else has come across any similar websites or tools that can help us discern truth from trash.

    I'd tend to disagree that "fake news" is an ineffectual label. As a journalist myself, I know that the media gets away with quite a bit of conscious manipulation of information, but that manipulation is at least subject to regulations and public bodies which enforce them, so it's limited. Fake news, on the other hand, is straight-up falsification. The constant stream of stories that pour out of the alt-right, claiming that every dark-skinned person who is caught in a criminal act is a "Muslim refugee" even though 99% of the time they are neither, is a perfect example of what I'm talking about. Yes, all manipulation of facts is wrong and most serious journalists deplore it, but what the bona fide fake news outlets are doing to whip up hatred against minorities is truly alarming... more so because no one can regulate it, which means it's allowed to seep into the global consciousness and fester unchecked.
