EU Commission Says Social Media Companies Must Take Down 'Terrorist Content' Within One Hour

from the plus-more-internet-hobbling-guidelines dept

Once social media companies and websites began acquiescing to EU Commission demands for content takedown, the end result was obvious. Whatever was already in place would continually be ratcheted up. And every time companies failed to do the impossible, the EU Commission would appear on their virtual doorsteps, demanding they be faster and more proactive.

Facebook, Twitter, Google, and Microsoft all agreed to remove hate speech and other targeted content within 24 hours, following a long bitching session from EU regulators about how long it took these companies to comply with takedown orders. As Tim Geigner pointed out late last year, the only thing tech companies gained from this acquiescence was a reason to engage in proactive censorship.

Because if a week or so, often less, isn't enough, what will be? You can bet that if these sites got it down to 3 days, the EU would demand it be done in 2. If 2, then 1. If 1? Well, then perhaps internet companies should become proficient in censoring speech the EU doesn't like before it ever appears.

Even proactive censorship isn't enough for the EU Commission. It has released a new set of recommendations [PDF] for social media companies that sharply shortens the mandated response time. The Commission believes so-called "terrorist" content should be so easy to spot, companies will have no problem staying in compliance.

Given that terrorist content is typically most harmful in the first hour of its appearance online and given the specific expertise and responsibilities of competent authorities and Europol, referrals should be assessed and, where appropriate, acted upon within one hour, as a general rule.

Yes, the EU Commission wants terrorist content vanished in under an hour and proclaims, without offering evidence, that the expertise of government agencies will make compliance un-impossible. The Commission also says it should be easy to keep removed content from popping up somewhere else, because it has compiled a "Database of Hashes."
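
For reference, such a hash database is just a shared set of fingerprints of previously removed files that platforms check new uploads against. Here is a minimal sketch of the idea (hypothetical code, not the Commission's or any platform's actual system):

```python
import hashlib

# Hypothetical shared blocklist: fingerprints of previously removed files.
removed_hashes = set()

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of a file's bytes (SHA-256)."""
    return hashlib.sha256(data).hexdigest()

def register_removal(data: bytes) -> None:
    """Record a removed file so byte-identical re-uploads can be caught."""
    removed_hashes.add(fingerprint(data))

def is_known_removed(data: bytes) -> bool:
    """Check an incoming upload against the shared database."""
    return fingerprint(data) in removed_hashes

register_removal(b"bytes of a previously removed video")
print(is_known_removed(b"bytes of a previously removed video"))  # True
print(is_known_removed(b"the same video, re-encoded"))           # False
```

Note the catch: an exact hash like this only matches byte-identical copies, so a re-encoded or slightly edited file sails right past it. That is why real matching systems lean on fuzzier perceptual hashes, which in turn produce exactly the false positives the rest of this article worries about.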

Another bad idea that cropped up a few years ago makes a return in this Commission report. The EU wants to create intermediary liability for platforms under the concept of "duty of care." It would hold platforms directly responsible for not preventing the dissemination of harmful content. This would subject social media platforms to a higher standard than that imposed on European law enforcement agencies involved in policing social media content.

In order to benefit from that liability exemption, hosting service providers are to act expeditiously to remove or disable access to illegal information that they store upon obtaining actual knowledge thereof and, as regards claims for damages, awareness of facts or circumstances from which the illegal activity or information is apparent. They can obtain such knowledge and awareness, inter alia, through notices submitted to them. As such, Directive 2000/31/EC constitutes the basis for the development of procedures for removing and disabling access to illegal information. That Directive also allows for the possibility for Member States of requiring the service providers concerned to apply a duty of care in respect of illegal content which they might store.

This would apply to any illegal content, from hate speech to pirated content to child porn. All of it is treated equally under certain portions of the Commission's rules, even when there are clearly different levels of severity in the punishments applied to violators.

In accordance with the horizontal approach underlying the liability exemption laid down in Article 14 of Directive 2000/31/EC, this Recommendation should be applied to any type of content which is not in compliance with Union law or with the law of Member States, irrespective of the precise subject matter or nature of those laws...

The EU Commission not only demands the impossible with its one-hour takedowns, but holds social media companies to a standard they cannot possibly meet. On one hand, the Commission is clearly pushing for proactive removal of content. On the other hand, it wants tech companies to shoulder as much of the blame as possible when things go wrong.

Given that fast removal of or disabling of access to illegal content is often essential in order to limit wider dissemination and harm, those responsibilities imply inter alia that the service providers concerned should be able to take swift decisions as regards possible actions with respect to illegal content online. Those responsibilities also imply that they should put in place effective and appropriate safeguards, in particular with a view to ensuring that they act in a diligent and proportionate manner and to preventing [sic] the unintended removal of content which is not illegal.

The Commission follows this by saying over-censoring of content can be combated by allowing those targeted to object to a takedown by filing a counter-notice. It then undercuts this by suggesting certain government agency requests should never be questioned, but rather complied with immediately.

[G]iven the nature of the content at issue, the aim of such a counter-notice procedure and the additional burden it entails for hosting service providers, there is no justification for recommending to provide such information about that decision and that possibility to contest the decision where it is manifest that the content in question is illegal content and relates to serious criminal offences involving a threat to the life or safety of persons, such as offences specified in Directive (EU) 2017/541 and Directive 2011/93/EU. In addition, in certain cases, reasons of public policy and public security, and in particular reasons related to the prevention, investigation, detection and prosecution of criminal offences, may justify not directly providing that information to the content provider concerned. Therefore, hosting service providers should not do so where a competent authority has made a request to that effect, based on reasons of public policy and public security, for as long as that authority requested in light of those reasons.

These recommendations will definitely cause all kinds of collateral damage, mainly through proactive blocking of content that may not violate any EU law. It shifts all of the burden (and the blame) to tech companies with the added bonus of EU fining mechanisms kicking into gear 60 minutes after a takedown request is sent. The report basically says the EU Commission will never be satisfied by social media company moderation efforts. There will always be additional demands, no matter the level of compliance. And this is happening on a flattened playing field where all illegal content is pretty much treated as equally problematic, even if the one-hour response requirement is limited to "terrorist content" only at the moment.


Reader Comments



  • Ninja (profile), 5 Mar 2018 @ 8:13am

    It will keep climbing down until the mandated removal interval is "instantly". Some platforms folded to the whining because of the money involved, but they should have expected that once they gave a hand, the whiners would not stop until they had taken the whole body.

    The whiners will keep whining and pushing for laws for eternity. And everybody will lose, including those platforms.


    • Designerfx (profile), 5 Mar 2018 @ 9:49am

      Re:

      The real reason isn't quite the timeframe or terrorism. It's shifting the accountability to being an afterthought.


    • That One Guy (profile), 5 Mar 2018 @ 10:10am

      Re:

      It will keep climbing down until the mandated removal interval is "instantly".

      Depending on what you mean, it could be worse than that. The next step after 'as soon as it's put up it needs to be taken down' is that it's never allowed up in the first place.

      Everything must be vetted before it's allowed to be posted, no exceptions (well, other than 'official' content of course...), so that no 'terroristic content' can possibly make it onto the platforms. That this will all but destroy the platforms is a price the politicians involved are willing to have others pay.


  • Anonymous Anonymous Coward (profile), 5 Mar 2018 @ 9:08am

    Oh, but only if it were actually possible

    I am waiting for the day when such governmental 'requests' are backed up with the personnel and protocols to actually execute their insane demands, at the government's own cost (I feel bad for the taxpayers, though). Then, when THEY find out it isn't actually feasible, we can enjoy the ensuing floor show and revel in the popcorn market spike.


    • Anonymous Coward, 5 Mar 2018 @ 10:38am

      Re: Oh, but only if it were actually possible

      Sometimes I wonder if it's going to herald a return to the TV age. Maybe the telcos are hoping we'll all give up on the Internet out of sheer frustration if it becomes that strict?

      One-way communication with the people from "reliable sources" was the norm, and regular plebs were limited to a "Letter to the Editor" or being interviewed on TV to express an outside (vetted) opinion.

      By that logic, it'd be much safer for everybody if everyone's words are checked over first, like they were back then. We can't let "dangerous"/"offensive" content on the Internet, at any cost to speaking freely! /sarcasm


      • Anonymous Coward, 5 Mar 2018 @ 10:46am

        Re: Re: Oh, but only if it were actually possible

        I wonder if it's going to herald a return to the TV age

        You mean the prolefeed?


      • PaulT (profile), 6 Mar 2018 @ 1:40am

        Re: Re: Oh, but only if it were actually possible

        "Maybe the telcos are hoping we'll all give up on the Internet out of sheer frustration if it becomes that strict?"

        Honestly, that is what they really want. Corporations tied closely to the way things happened pre-Internet would love it if a broadcast model was the only one that was left to be viable, while the governments would love it if they could control the narrative like they used to.

        That's why there's such a divide between "silicon valley" and the telcos/government. The latter either don't understand the modern world or profited far easier from the way things used to be, while the former both understand it and have worked out how to leverage it to their benefit.


  • Anonymous Coward, 5 Mar 2018 @ 9:36am

    Currently I think it will be impossible to manage. But I do see a way that might expedite the process and get the speeds the government wants: couldn't the government host a database of terroristic content that social media servers can reference when banning? That puts the work back on the government and would probably be much faster at pulling content down. You will still get false positives, and I think the entire system is a dumb waste of time, but it makes the government do all the work, which is how it should be in situations like this.


    • Wendy Cockcroft, 6 Mar 2018 @ 5:38am

      Re:

      Oh dear...

      First of all, how would they populate their database? This content is created by individuals and groups who then post it to social media sites using video-sharing services. Therefore, the government is not going to get this first, they are.

      Due to the government's own rules, this stuff has to be taken down within the hour, which is not enough time for it to be caught by the government, unless they issue an order that copies of all items taken down must be made and sent to the government, thereby raising the cost of compliance.

      Therefore, there's no way of shifting the burden to the government as you describe it; they'd be reliant on ISPs to find, identify, store, and send the dodgy content in order to set up and maintain their database.


  • orbitalinsertion (profile), 5 Mar 2018 @ 9:37am

    Oh, it's crazy demands time, OK.

    Governments, we demand you keep terrorists off our sites and services. Please drone bomb them within an hour of us - or you! - finding any terrorist or child porn content on our sites. Kthxbye.


  • Roger Strong (profile), 5 Mar 2018 @ 9:39am

    By the same reasoning, this problem would go away if the EU Commission simply gave European governments one day to stop terrorism within their borders.


  • Designerfx (profile), 5 Mar 2018 @ 9:42am

    So why is this only for "Social media content"?

    If this were so magically important, wouldn't they be just as concerned with text messaging, newspapers, or TV media?


    • Anonymous Coward, 5 Mar 2018 @ 11:19am

      Re: So why is this only for "Social media content"?

      People participating in social media are freer to dispute and/or break established narratives from larger players. But if they call it dangerous and remove it within an hour, the information won't spread that far. At least that's what I'm guessing they think.

      Keep the world moderately dangerous (or make words themselves "dangerous" under the law if your country is safer) and the censorship will make dumb people feel a false sense of security whilst losing their freedoms.


  • crade (profile), 5 Mar 2018 @ 10:04am

    It's just a pretense for protectionist measures. Europe just wants to hamstring foreign companies' ability to operate in Europe and make it so that only a specialized "European version" is acceptable.


    • Metaphor, 5 Mar 2018 @ 10:37am

      Re:

      Nah, just disconnect the EU from all social media platforms.
      Then, when their people revolt against their buffoonish overlords, they can come back.


  • TheResidentSkeptic (profile), 5 Mar 2018 @ 10:11am

    Time Travel is NOT possible...

    Making all social media go away, having Google stop indexing their sites, and making the rest of the internet go away is just NOT going to bring back the halcyon days of the 1950s and the supremacy of newspapers. No matter how much they wish it would, their time in the sun has ended.

    The internet removed government control of the messengers; thus removing control of the message.

    And *that* genie is NOT going back into the bottle...

    So keep on trying to kill the internet one piece at a time... and it will keep up with its normal response:

    "Damage Detected. Re-Routing"


    • Anonymous Coward, 6 Mar 2018 @ 8:51pm

      Re: Time Travel is NOT possible...

      To which the government response will increasingly be:

      "Non-Compliance Detected. Bankrupting / Litigating / Holding at gun point."

      Granted we're not at the last one on a widespread basis yet, but believe me, those pushing censorship will happily implement it given enough time. They always do.


  • Anonymous Coward, 5 Mar 2018 @ 10:28am

    Especially interesting with respect to the new Polish law making it illegal to 'accuse' the Polish state of being complicit in Nazi war crimes. The law has taken effect despite heavy criticism from other EU countries. But any online discussion on the merits of this law would have to be shut down by social media companies if so directed by 'competent (Polish) authorities'.


  • Anonymous Coward, 5 Mar 2018 @ 10:29am

    Freedom of speech

    Needs to be something the people believe in or this is what you get.

    I keep trying to tell people that the harder you fight what you think is wrong, the more damage you wind up doing to yourself. By all means, go ahead and grind a pound of meat, then stick your hand in with it to make sure all of it has been ground up. That is how stupid people are, on all sides of this issue.

    Right now, crazy has been called sane!


    • Anonymous Coward, 5 Mar 2018 @ 6:46pm

      Re: Freedom of speech

      "Needs to be something the people believe in "

      A recent editorial claimed a large percentage of the public does not believe anything Donald has to say, and that this is dangerous. Many months ago I thought that was funny; now it is getting a bit scary.


  • Metaphor, 5 Mar 2018 @ 10:35am

    Turn off Social Media in the EU...

    It's the only way to be certain.

    Seriously though, turn off the access to them from EU ip addresses.

    If they want to use the sites, they'll have to VPN in, and then the EU laws won't apply.


  • That One Guy (profile), 5 Mar 2018 @ 10:36am

    Pointy-haired boss syndrome...

    Where everything is easy when you don't have to do it.

    It would be interesting if the companies in question were to flat out call the politicians out on this. Demand that, if it's so easy to spot the content in question, they provide examples.

    Out of these 10 videos, which should be removed?

    How about out of 100?

    1,000?

    10,000?

    Keep in mind the allowed time-frame isn't any different from the first set to the last: you had one hour to go over ten videos, and you have one hour to go over ten thousand.

    If it's really so easy then they should have no problem keeping up with the task as it scales up to what they are demanding the companies manage.

    If the response when they fail is 'well hire more people!' demand that they flat out state how many people they think the companies should be obligated to hire, how many extra bodies they want warming seats, checking through everything to ensure only approved content is allowed to be posted.

    Add to that the fact that the government is demanding direct, unquestionable censorship power, the ability to state "this is to be taken down, and you cannot contest that decision", and this goes from "really bad" to "disastrously bad" mighty quick.


    • Anonymous Coward, 5 Mar 2018 @ 10:53am

      Re: Pointy-haired boss syndrome...

      Some of the regulations around the internet are starting to look a lot like some of the law around copyrights with respect to libraries. Just as libraries would never be allowed to exist if they hadn't already been a longstanding thing before much of the law was put into place, it seems we're quickly reaching a point where the internet couldn't have ever become a thing if it had started in a later era.


    • Ryunosuke (profile), 5 Mar 2018 @ 10:54am

      Re: Pointy-haired boss syndrome...

      holy balls, I now see everyone clamoring for "secure back doors" as the pointy-haired boss, and everyone else is Dilbert, or at the very least Wally, with a couple of Loud Howards added in.


  • Anonymous Coward, 5 Mar 2018 @ 10:50am

    Can't wait til this law is used to proactively block government websites and agencies. Double bonus if it's another government agency doing the requesting (probably from a different country).


  • Anonymous Coward, 5 Mar 2018 @ 11:37am

    Did Spain have a huge influence on this law? It sounds like they want to stop dissent, which would greatly help out Spain's government. If they did play a key part, I wonder how much was motivated by revenge for Google delisting their news.


  • ECA (profile), 5 Mar 2018 @ 11:41am

    Who is this group??

    I suggest some of you look this group up..

    THIS ISN'T the EU itself..
    This is the group that is supposed to be responsible for interactions and trade between the EU countries..
    It's a BUNCH of WELL PAID persons, funded by each of the EU states, who are supposed to represent EACH EU state (they love passing laws, for some stupid reason).

    1. This is one step in the concept of controlling NEWS/INFORMATION/distribution.
    2. WHO THE F.. IS THIS??
    3. Whose idea was this??

    NOW for the fun part..
    HOW BIG is Google? Could we say that IF Google wanted to, about half the internet would no longer be accessible to the EU??


  • Anonymous Coward, 5 Mar 2018 @ 12:02pm

    This is why you don't give an inch to the left... they really want 100 miles.


    • Anonymous Coward, 5 Mar 2018 @ 6:49pm

      Re:

      This is why political parties are a very bad idea.


    • Wendy Cockcroft, 6 Mar 2018 @ 2:43am

      Re:

      The authoritarians behind this, in Poland and in Spain, are right wing. Try again.


      • DNY (profile), 6 Mar 2018 @ 8:19pm

        Re: Re:

        Not likely. While Spain has some influence, Poland's influence on the European Commission is just about nil. A proposal this sweeping isn't going to come out of the Commission without France and Germany being behind it. The only authoritarian influence in play here is the EU itself.

        Quite frankly this sort of thing should make everyone in the UK glad that "Leave" won, no matter how rocky the change to trading with the EU under WTO rules proves to be. (Yes, I think it will come to that, precisely because of the "we are to be obeyed" attitude of the European Commission.)


        • PaulT (profile), 7 Mar 2018 @ 12:09am

          Re: Re: Re:

          "Quite frankly this sort of thing should make everyone in the UK glad that "Leave" won"

          Only if you're delusional enough to think that similar and worse rulings won't come out of Westminster. Frankly, if you look at the history of the EU and the Tories, we have actually been rescued from far worse things already being made law with fewer protections for the public. Things will get worse for us; the only thing that will change is that people like you will no longer have the EU boogeyman to blame (although, no doubt, your tabloids will find a way to tell you to blame them anyway).


  • tp, 5 Mar 2018 @ 12:42pm

    Companies themselves to blame

    > And every time companies failed to do the impossible, the EU Commission would appear on their virtual doorsteps, demanding they be faster and more proactive.

    The real reason is that these companies have taken responsibility for a larger amount of content than they can actually handle properly. The EU's demands are perfectly valid for smaller/quicker companies that don't have huge market reach. The large companies like facebook, google and twitter were just greedy when they spread their networks to the whole world. If they can't handle their market reach, they have the alternative of reducing their global reach or hiring more people to handle the problems. But any idea that the EU's demands are somehow unreasonable simply because the large companies cannot fulfill the requirements is just crazy. It's the companies' own problem that they wanted the whole world to use their platforms.


    • Anonymous Coward, 5 Mar 2018 @ 1:15pm

      Re: Companies themselves to blame

      Think of the social media sites as pubs and cafes, where there is no expectation that the owner controls what is being discussed by the patrons, and indeed most of the time does not even hear those conversations. They are not newspapers, where an editor selects the content to be published.


      • tp, 5 Mar 2018 @ 1:36pm

        Re: Re: Companies themselves to blame

        > Think of the social media sites as pubs and cafes

        Owner of the pub will still call police every time you bring your semiautomatic machine gun to the party.


        • Anonymous Coward, 5 Mar 2018 @ 1:57pm

          Re: Re: Re: Companies themselves to blame

          > Owner of the pub will still call police every time you bring your semiautomatic machine gun to the party.

          That's a great example. And in your example, the owner of the pub wouldn't be held liable for the actions of the patron who brought the semiautomatic machine gun to the party, even if they had a habit of kicking out other people they'd noticed had guns before. And pub owners are also not responsible for finding all the semiautomatic machine guns their patrons might bring within an hour. And, to bring the analogy closer to the truth, the pub holds a few million people, and thousands of them brought black-painted Nerf guns.


          • tp, 5 Mar 2018 @ 2:20pm

            Re: Re: Re: Re: Companies themselves to blame

            > And pub owners are also not responsible for finding all the semiautomatic machine guns their patrons might bring within an hour.

            Well, they actually are responsible if some lunatic shoots people on their premises. This is why there are professional guards at the door, so that they detect the guns before letting people inside the party place. Some parties even check for knives and other useful gadgets before letting partygoers in.

            Organizers of large gatherings are obviously responsible if, during the event, something bad happens and people get injured or killed.

            Obviously the damage with social media sites is of a different nature, and terrorist material has a tendency to be spammed across large parts of the world in order to find the people who are manipulable enough to work for the terrorists. This is why the platforms need to be careful with content quality before spamming the material to large groups of people.


            • Anonymous Coward, 5 Mar 2018 @ 2:36pm

              Re: Re: Re: Re: Re: Companies themselves to blame

              Well, they actually are responsible if some lunatic shoots people in their premises. This is why there's professional guards in the door,

              So would you hold the school staff, and the cops guarding the place, responsible for the deaths in the Florida school shooting, as they failed to keep the gunman out?


              • tp, 5 Mar 2018 @ 2:56pm

                Re: Re: Re: Re: Re: Re: Companies themselves to blame

                > the cops guarding the place responsible for deaths in the Florida school shooting, as they failed to keep the gunman out?

                Yes. If the professional security people can't do the job, who can? Of course it's a team effort in schools, so teachers can report it if they see people heading in the wrong direction, but there are always people who are responsible when things don't go as planned.

                There's a reason why security people get paid -- so that everyone else can feel safe and secure against large masses of people and whatever can happen when people go broken. Detecting and fixing the problems is the responsibility of the professional guards, teachers, police and everyone else who can detect the activity.

                Social media companies are experts in social media's effects on the world, so they ought to be controlling whatever is happening in that area.

                Note that a pub will have one professional guard per 50 people visiting its premises, and I'm not sure if facebook and twitter have that many employees... Maybe they're running out of resources to handle their impact on the world.


                • Anonymous Coward, 5 Mar 2018 @ 6:10pm

                  Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame

                  There is a reason security people get paid. So that when shit goes down, there is somebody on hand to deal with it. Not so they can minority report every possible instance of something going at all wrong.

                  And frankly, this isn't akin to the cops guarding the place being responsible for deaths in the Florida school shooting. This is the cops guarding the place being responsible for not putting every single individual on campus through an x-ray machine, pat-down and full search every hour they remain on the premises, with every word spoken scrubbed for anything law enforcement might not like, actual context, illegality or threat to safety be damned.

                  Where two students talking about a movie can be taken out of context to make them the next shooters. Where expressing displeasure with the existing administration gets you treated as a terrorist. All because someone in charge is "just being careful", lest they be held personally responsible for that one-in-a-million statement that precedes an attack.

                  I am not using an accurate statistic, but giving an indication of scale. It is Facebook's responsibility to become investigative officers. It is Twitter's job to personally vet every insignificant cat video and selfie. In the example of a bar, listening devices must be planted on every patron and every second of conversation listened to in real time, to ensure that the patrons don't say something that might annoy someone whose ACTUAL JOB IT IS to do investigative work.

                  Would you want to see your bar have to hire enough people to put your patrons under more scrutiny than a max security prison? On your own dime? Because stopping one unacceptable comment or conversation is worth putting the dozens, hundreds, THOUSANDS of other conversations under the microscope?

                  In your example, the bar would shut down, the burden of liability too great to police profitably, or frankly at all. You'd either see "No talking" signs posted everywhere, with demands that every person entering the bar be strip-searched, or you'd see the next bar fight land the bar owner in jail on assault charges for not pre-empting a confrontation. Because "they were responsible for making everyone feel safe and secure".

                  One might say "Well, they are being given an hour, it's not instantaneous", but (1) if companies bend to the 1-hour timeframe, you can bet the next step is for politicians to DEMAND everything be verified and vetted and approved before being put online, and (2) if you compare the scale of conversations happening on social media to the conversations happening at the bar or school in your example, the burden is... frankly, still not equivalent. Companies are still given the more impossible task.

                  I'm sick of hearing governments demand everyone else do law enforcement's job for it. Let's flip the scenario here.

                  A murderer got away with a crime? A theft occurred? Law enforcement has one hour (hell, let's be generous, one day) to solve the crime. Otherwise they are liable for the damage caused, especially if the same individual commits another crime. The burden is too great for law enforcement to keep up? Hire more people! Apparently it's that simple, if platform holders are being held to such a standard. I mean, that's why we hire law enforcement, right? If they can't do the job, who can? There's always someone responsible when things don't go as planned. There's a reason police get paid: so that everyone else can feel safe and secure against large masses of people and whatever can happen when people snap. Detecting and fixing these problems is the responsibility of law enforcement, who can actually detect the activity.

                  Law enforcement are the experts in crime's effects on the world, so they ought to be controlling whatever is happening in that area.

                    PaulT (profile), 6 Mar 2018 @ 3:22am

                    Re: Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame

                    "Law enforcement has one hour, hell let's be generous, one day to solve the crime. Otherwise they are liable for the damage caused, especially if the same individual commits another crime"

                    Unfortunately, history has proven that when authorities are under pressure to get results, all it means is that innocent people are railroaded for crimes they didn't commit. Adding time pressure means they just try to get *any* conviction, not investigate the crime properly and find out who actually did it.

                    That's actually one of the problems here - social media companies will be forced to suppress a huge amount of legitimate speech because they're afraid of something slipping through without their control. It seems that the reality is beyond tp's ability to comprehend, but it's not a question of manpower or the site's ability to process actual complaints.

                      Wendy Cockcroft, 6 Mar 2018 @ 5:46am

                      Re: Re: Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame

                      In other words, it's a demand-side problem, and if history proves anything, it's that you can't solve social problems by banning stuff.

                      While it is entirely reasonable to expect ISPs to remove questionable content in a timely fashion, it takes human eyeballs to decide what is or isn't acceptable within the company's TOS.

                      They're already frantically playing whack-a-mole and often deleting innocent content. It's unreasonable to expect them to speed the process up as it'd mean more innocent content gets taken down with the bad stuff.

                      Banning stuff is easy. It's a damn sight harder to tackle the social issues that drive people to terrorism, etc.

                        PaulT (profile), 6 Mar 2018 @ 6:18am

                        Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame

                        "While it is entirely reasonable to expect ISPs to remove questionable content in a timely fashion..."

                        Well, that's really the issue, isn't it? Saying "you should remove content in a timely manner after it's been identified" is one thing, and an hour shouldn't be too much to ask under those circumstances.

                        However, what they're actually being asked to do is *identify* the content, then remove within an hour. That's a completely different request. This is where the disconnect lies, I think - some people don't realise that the first task takes a lot more than the second.

                        "Banning stuff is easy. It's a damn sight harder to tackle the social issues that drive people to terrorism, etc."

                        Which, frankly, is why we hear so much idiotic crap like this. It's far easier to point at a third party and blame them and/or pass the responsibility to them than it is to fix root causes. The real fixes are both very long-term and fairly boring. It's much easier to get voted in on easy "fixes" that do nothing than it is to tackle grassroots issues that will bear real fruit in a decade.

                          Wendy Cockcroft, 6 Mar 2018 @ 7:18am

                          Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame

                          Indeed. And politicians wonder why we despise them so much! They need to become more willing to have those conversations.

                  Anonymous Coward, 5 Mar 2018 @ 10:55pm

                  Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame

                  Thank goodness you are not in a position where you would ever be considered an expert in any law.

                  Especially since the entire reason for your belief that fair use doesn't exist is because it takes too long to prove. You know what other bits of law take too long to prove? Try the whole damn thing...

                  PaulT (profile), 6 Mar 2018 @ 3:16am

                  Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame

                  "Note that a pub will have one professional guard per 50 people visiting their premises"

                  Not where I live, they don't. Your premise is either very wrong, or you live somewhere unusually dangerous, which doesn't translate to the rest of the real world.

              PaulT (profile), 6 Mar 2018 @ 3:11am

              Re: Re: Re: Re: Re: Companies themselves to blame

              "Well, they actually are responsible if some lunatic shoots people in their premises. This is why there's professional guards in the door"

              As usual, thank fuck I don't live in a place where guns are so common that this is expected to be the norm. I'll happily wander in and out of my local pubs without having to pass through armed guards or risk being shot by other people who go there, thanks.

                JoeCool (profile), 6 Mar 2018 @ 10:30am

                Re: Re: Re: Re: Re: Re: Companies themselves to blame

                As usual, he doesn't know what he's talking about. I've never HEARD of a bar with guards, much less one per fifty customers. Dance clubs will have a screener at the door to make sure there aren't so many people inside that it would violate fire regulations, or to check IDs if there is an age restriction to the club, but they aren't guards and don't protect anyone in the club. In a VERY dangerous neighborhood, the bartender will often be armed, but he's usually the only one.

                  PaulT (profile), 6 Mar 2018 @ 12:39pm

                  Re: Re: Re: Re: Re: Re: Re: Companies themselves to blame

                  Oh, there are certainly bars with bouncers at the door, and definitely clubs. Weekends in the early hours can get a little hairy, even in countries where nobody ever has a gun. I was in Glasgow this weekend just gone, and you had better believe that bouncers guard the city centre bars there, even the hotel bars, and often with good reason. But, no guns.

                  But, he said pubs. Even with cultural differences, if you're going to a pub that can't have more than 50 people there without people packing deadly weapons, you're either in a very, very bad part of town, or you're a lying asshole. I think I know which one he is.

          Anonymous Coward, 5 Mar 2018 @ 2:17pm

          Re: Re: Re: Companies themselves to blame

          Offer conversations as the nearest analogue, and you bring up physical objects! Obviously the web site owner will try to call the cops if you carry a machine gun into their offices, but that is not the same as talking about them on their site.

            Anonymous Coward, 5 Mar 2018 @ 6:51pm

            Re: Re: Re: Re: Companies themselves to blame

            When people do that, it's like they are admitting that they got nothin'.

    Anonymous Coward, 5 Mar 2018 @ 12:55pm

    >Given that terrorist content is typically most harmful in the first hour of its appearance online...

    Who, aside from a microcephalic lunatic with terminal hydrophobia, would give that?

    It's most harmful in that brief period when only the people who already know about it can find it? Really? REALLY?

    Citation needed: not that we would believe the citation, but at least we would know that the source of the citation was a professional long-nosed incendiary-trousers prevaricator.

      Anonymous Coward, 5 Mar 2018 @ 1:59pm

      Re:

      And if that were true, then there'd be no point in removing it in that hour since the damage is clearly already done by the time it's removed.

    Wannabe Fourth Grade English Teacher, 5 Mar 2018 @ 1:28pm

    Typo in original posting...

    You wrote "...a new set of recommendations ... that sharply increases mandated response time."

    Likely you meant either "sharply DECREASES mandated response time," or "sharply increases mandated response SPEED."

    Anonymous Coward, 5 Mar 2018 @ 1:46pm

    Are the politicians requiring moderators to be expert enough to pull every comment by haters of corrupt treasonous politicians?

    Anonymous Coward, 5 Mar 2018 @ 6:15pm

    These politicians clearly aren't familiar with the dickish traits of humanity. Put this into law and you will have thousands of people doing their best to post content that sits precariously on the fence, yet is still perfectly legal, so that the individual tasked with going over thousands of messages in the next hour just to keep their job won't know what the hell to do. Then that content gets censored. Then they toe THAT line. Then that content gets censored. The corporation is gradually forced to ratchet up pre-emptive censorship to the point where my cat video is considered anti-Semitic, war-mongering hate speech because my cat's fur makes it look like it has a Hitler stache.

      tp, 6 Mar 2018 @ 12:05am

      Re:

      > where my cat video is considered anti-Semitic war-mongering hate speech because my cat's fur makes it look like it has a Hitler stache.

      There's a thing about cat videos turning evil, and fur being linked to ethics problems or being wrong on the internet; but whoever mentions Hitler first in a discussion is known to lose the discussion.

    Anonymous Coward, 5 Mar 2018 @ 8:30pm

    Whatever was already in place would continually be ratcheted up.

    Sounds a lot like copyright.

    Wendy Cockcroft, 6 Mar 2018 @ 2:41am

    Censorship by degrees

    This is nothing but a power-grab, an effort to rule the internet itself. They know damn well it's impossible to comply because this is a demand-side problem: people want to be gits online. How do you stop that at source, where the problem is?

    And how long till it extends to copyrighted items? That's where it's headed, people. Historically, mission creep toward copyright enforcement has always been the pattern.

    Anonymous Coward, 6 Mar 2018 @ 3:28am

    Just waiting to hear from Amber Rudd and her miracle £650K solution...

    Anonymous Coward, 6 Mar 2018 @ 3:47am

    This request is funny, given the time it takes for them to make up their minds on anything, let alone pass new lawful laws.

    John85851 (profile), 6 Mar 2018 @ 10:10am

    What is terrorist content?

    Classifying something as "terrorist content" sounds like classifying any adult image as "porn": sure, there's the obvious stuff, but what about the not-so-obvious stuff? Who sets the rules?

    For example, if someone says "Death to all Christians", then that could probably be a terrorist threat.
    But if someone says "Death to all Muslims", then they're repeating what so many other people (and politicians) are thinking.
    Yet saying "death to anyone" should be treated the same.
