Zoom & China: Never Forget That Content Moderation Requests From Government Involve Moral Questions

from the lessons-to-learn dept

If you've been around the content moderation/trust and safety debates for many years, you may remember that in the early 2000s, Yahoo got understandably slammed for providing data to the Chinese government that allowed it to track down and jail a journalist critical of the government. This was a wake-up call for many about the international nature of the internet -- and the fact that not every "government request" is equal. This, in fact, is a key point that is often raised in discussions about new laws requiring certain content moderation rules to be followed -- because not all governments look at content moderation the same way. And efforts by, say, the US government to force internet companies to "block copyright infringement" can and will be used by other countries to justify censorship.

The video conferencing software Zoom is going through what appears to be an accelerated bout of historical catch-up as its popularity has rocketed thanks to global pandemic lockdowns. It keeps running into problems that tons of other companies have gone through before it -- the latest being, as noted above, that not all government requests are equal. It started when Zoom closed the account of a US-based Chinese activist who used Zoom to hold an event commemorating the Tiananmen Square massacre. Zoom initially admitted that it shut the account to "comply" with a request from the Chinese government:

“Just like any global company, we must comply with applicable laws in the jurisdictions where we operate. When a meeting is held across different countries, the participants within those countries are required to comply with their respective local laws. We aim to limit the actions we take to those necessary to comply with local law and continuously review and improve our process on these matters. We have reactivated the US-based account.”

That response did not satisfy anyone, and as more and more complaints came in, Zoom put out a much better response, which more or less showed that it's coming up to speed on a ton of lessons that have already been learned by others (at the very least, it suggests the company should hire some experienced trust and safety staffers...). To its credit, though, Zoom admits that in taking the Chinese government's requests at face value, it "made two mistakes":

We strive to limit actions taken to only those necessary to comply with local laws. Our response should not have impacted users outside of mainland China. We made two mistakes:

  • We suspended or terminated the host accounts, one in Hong Kong SAR and two in the U.S. We have reinstated these three host accounts.
  • We shut down the meetings instead of blocking the participants by country. We currently do not have the capability to block participants by country. We could have anticipated this need. While there would have been significant repercussions, we also could have kept the meetings running.
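Country-level blocking of the sort Zoom says it lacked typically means resolving each joining participant's IP address to a country and refusing connections from restricted jurisdictions, while leaving the meeting running for everyone else. A minimal sketch of that idea -- purely illustrative, with hypothetical helper names and a stand-in lookup table, not Zoom's actual implementation:

```python
# Illustrative sketch of per-country participant blocking (hypothetical;
# not Zoom's actual code). Assumes some GeoIP lookup maps IPs to countries.

BLOCKED_COUNTRIES = {"CN"}  # jurisdictions where joining must be refused

# Stand-in for a real GeoIP database lookup (e.g. a MaxMind-style table).
GEOIP_TABLE = {"203.0.113.5": "CN", "198.51.100.7": "US"}

def ip_to_country(ip: str) -> str:
    # Unknown addresses map to "UNKNOWN" and are not blocked here.
    return GEOIP_TABLE.get(ip, "UNKNOWN")

def may_join(participant_ip: str) -> bool:
    """Refuse only participants inside blocked jurisdictions,
    rather than shutting down the whole meeting."""
    return ip_to_country(participant_ip) not in BLOCKED_COUNTRIES
```

The point is scoping: the drastic option Zoom actually took (killing the meeting) affects every participant globally, while a per-connection check like this limits the compliance action to the requesting jurisdiction.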

There are reasonable points to be made that a company like Zoom should have anticipated issues like this, but at the very least you can give the company credit for admitting (directly) to its mistakes, and for coming up with plans and policies to avoid repeating them in the future.

But there is a larger point here that often gets lost in all these discussions about trust and safety and content moderation. So much of the debate rests on the assumption that (1) requests to block or take down content or accounts are made in good faith, and (2) those making the requests share similar values. That's frequently not the case at all. We've shown this over and over again here on Techdirt, where laws against "fake news" are used to silence critics of a ruling class.

So for anyone pushing for laws that require internet companies to somehow ban or block "bad behavior" and "bad actors," you need to be able to come up with a definition of those things that won't be abused horribly by authoritarian governments around the globe.

Filed Under: activism, china, content moderation, content moderation at scale, government requests, jurisdiction, moral choices, tiananmen square
Companies: zoom


Reader Comments



  • Anonymous Anonymous Coward (profile), 12 Jun 2020 @ 4:07pm

    They ain't seen nothin' yet

    Just how will Zoom handle the request to shut down a meeting when some IP maximalist discovers that one of the meeting participants has a radio playing in the background with some music (already paid for BTW) they claim to own the copyright on? That the cows might hear it has been used already, but I am sure the maximalists will come up with some better excuse to make the claim that it is a 'secondary' performance that requires additional fees.

    Not being a user, can one record a Zoom meeting, then post it to YouTube? There, the problem would be the YouTube user for posting it (with YouTube in the middle of course) but I could see a dedicated maximalist trying to draw Zoom into things for having aided and abetted the recording.

    /serious or /sarcastic...could go either way


    • Anonymous Coward, 12 Jun 2020 @ 7:17pm

      Re: They ain't seen nothin' yet

      Answer: Yes, a registered account (vs. free access) can record a Zoom meeting. It is not particularly well compressed, but you can do this thing.

      And once you have the recording, you can do with it anything you could do with other such recordings.

      You still have the problem of China, for example, ordering the video blocked in China via Content ID, but then you always have that. You also have the normal maximalist problems of the "if there is background music, then it must be banned" sort of thing, or "we reported using this footage so it all belongs to us now".


  • Glenn, 12 Jun 2020 @ 4:47pm

    People like to pretend that "bad" is an absolute term. It is, however, only a relative term. Its true meaning is this: the opposite of what the person using the term views as "good" and "right".


  • Upstream (profile), 12 Jun 2020 @ 4:50pm

    At least they are trying

    I have not participated in Zoom, for the same reasons I don't participate in Facebook, Twitter, or the like. Micah Lee had some unflattering things to say about Zoom here and here. What I found on Zoom's Privacy and Security page seemed to be largely non-specific fluff, and the term "metadata" is not mentioned at all.

    I do find it very refreshing and encouraging that, instead of stonewalling or engaging in double-speak, they have been straightforward about admitting errors. From this article and other things I have read about Zoom, they are indeed trying to "catch up," and that is a very good sign.


  • Anonymous Coward, 12 Jun 2020 @ 4:58pm

    Yep, the US is not in the PRC's jurisdiction. At least you copped to it after the fact.

    From another angle, China could have just blocked participants inside its jurisdiction itself.


  • vilain (profile), 12 Jun 2020 @ 6:41pm

    Could it be because their dev team is in China?

    Zoom has said previously, when it had to reconsider its security model in light of increased free-tier use, that a lot of its dev team is in China. I wonder if their initial willingness to please the PRC was so their dev team stays in place and isn't arrested, tried, and thrown in prison.

    WebEx originally came from a Chinese dev team, which still develops that product, so Cisco has the same problem.


  • Anonymous Coward, 12 Jun 2020 @ 6:46pm

    The problem is western companies have to comply with laws in Russia and China, or their apps will be blocked and no one will get to use them. So the choice is to block or censor some users, or lose all your customers in that country.

    Laws about copyright can be used to censor users or remove content. Many videos on YouTube that are legal in terms of fair use and commentary are taken down using the DMCA, e.g. a 2-hour video can be taken down because it has a few seconds of music in it.

    Zoom is not like YouTube: most videos are only seen by the people who are in the meeting, and you can only join the meeting using a password. In most cases there is no reason to record the video. Most videos on YouTube, by contrast, can be viewed by anyone unless the video is set to private.

    The reason Zoom took off is that it's easy for anyone to use, while Skype has been redesigned a number of times, which makes it hard to use, and you need to sign up with Microsoft to use it.

    I don't think music companies will start going after Zoom, since most of the videos are not public, e.g. they don't qualify as a public performance.

  • This comment has been flagged by the community.
    tz1 (profile), 13 Jun 2020 @ 6:22am

    They just need a rage mob.

    The problem is there is already a de facto ban on some speech and points of view, generally conducted by a rage mob reporting that some post is "icky", that the person is "icky", and that being "icky" goes against something in the terms of service. And the terms of service are intentionally kept ambiguous so the enforcement will be arbitrary. These contracts of adhesion are exactly like the laws they propose to stop "icky" speech from the other side. They usually use the term "offensive", but then say it is hateful too, but it only goes one way.

    The enforcers almost all voted for Hillary, and those that didn't voted for the Green party or are still far to the left. They ban what offends them personally, not what might violate the actual rules, but the ambiguity gives them cover, since they never have to explain what terms of service were violated, or how, just that they were. That would happen with any attempt at a law, but it is happening now, and usually just after some rage mob wants someone's scalp. Does anyone remember what SPECIFIC thing Alex Jones was banned everywhere for (including Apple; apparently it went up to Tim Cook)? Sarah Jeong's tweets were more offensive than anything Milo tweeted, but Milo got banned. Jeong's tweets AFAIK are still up. Candace Owens simply did a find-and-replace of white to black, and those otherwise identical tweets earned her a suspension.

    Few if any complain about vertical moderation -- offensive terms, profanity, vulgarity, language suggesting violence, etc., like G vs. PG vs. PG-13 vs. R vs. X -- as the rules can be fairly clear. Even Gab does that to some extent with its NSFW flag (as do Minds and Parler IIRC). The problem is the horizontal moderation, where the ban limit is anything to the right of a left-moving boundary. Or vice versa, but I know of no platform which does so.


    • Toom1275 (profile), 13 Jun 2020 @ 8:32am

      Re: They just need a rage mob.

      [Projects facts not in evidence]


    • Anonymous Coward, 13 Jun 2020 @ 8:47am

      Re: They just need a rage mob.

      wtf are you talking about


    • Anonymous Coward, 13 Jun 2020 @ 11:16am

      Re: They just need a rage mob.

      The problem is there is already a defacto ban on some speech and points of view,

      Can you set up your own blog, or find a platform that accepts those points of view? If so, there is no ban on some speech. If you cannot attract an audience, or the platform has a small number of users, then you have a measure of the popularity of such speech.


      • Anonymous Coward, 13 Jun 2020 @ 11:23am

        Re: Re: They just need a rage mob.

        Addendum: if you do not like the users of those platforms that accept your speech, then you know how others see you.


      • Anonymous Coward, 13 Jun 2020 @ 12:35pm

        Re: Re: They just need a rage mob.

        If you cannot attract an audience, or the platform has a small number of users, then you have a measure of the popularity of such speech.

        Or maybe you just have an indication of how Google's algorithm ranks the site or the speech.


        • Toom1275 (profile), 13 Jun 2020 @ 12:50pm

          Re: Re: Re: They just need a rage mob.

          By popularity then relevance; i.e. like he just said.


        • Anonymous Coward, 13 Jun 2020 @ 1:26pm

          Re: Re: Re: They just need a rage mob.

          People recommending a site will increase its popularity, and its Google ranking. However, those whose speech is unpopular will seek any reason other than people not wanting to listen to explain their own lack of popularity.


  • This comment has been flagged by the community.
    Anonymous Coward, 13 Jun 2020 @ 7:09am

    TechDirtTopia

    Monopoly capital/imperialism is an irrational system. It is not organized to meet human needs. It is run by a very small ruling class whose only morality is the morality of the maximum profit.

    Join us in MASNICKI, a new independent country, located in Masnick's basement apartment. We put pillows all around the edges, and covered the whole thing with a blanket, and now we have an INDEPENDENT COUNTRY! Yesiree Baghdad Bob, we are INDEPENDENT!

    WE HAVE A DREAM! THE TECHDIRT DREAM! WE SHALL OVERCOME!


    • Scary Devil Monastery (profile), 15 Jun 2020 @ 6:13am

      Re: TechDirtTopia

      Reduced to childish tantrums again, Baghdad Bob. Must mean it's soon time for the Hamilton persona to take over again and start screaming, eh?


      • Toom1275 (profile), 15 Jun 2020 @ 7:59am

        Re: Re: TechDirtTopia

        At least his meth dealer has been able to keep business up during the pandemic. Thanks to Spamiltard's posts, we can tell his shipment schedule.


        • Scary Devil Monastery (profile), 16 Jun 2020 @ 2:29am

          Re: Re: Re: TechDirtTopia

          So when he's sullenly whining all the time and calling people bad names half-heartedly, he's in withdrawal, and when he churns out wordwalls which degenerate into incoherent screaming and actual death threats, he's high as a kite?

          Well, I would have figured it was all some form of schizophrenia but your hypothesis is plausible enough.


          • Toom1275 (profile), 16 Jun 2020 @ 8:49am

            Re: Re: Re: Re: TechDirtTopia

            Meth use can cause paranoid delusions, including hallucinations of gangs of unknown people attacking the user.

            I've seen stories ranging from a guy caught trying to break into a victim's house, claiming he needed safety from the gang of Mexicans chasing him down the (empty) street, to kids flipping out and going siege-mode against an invisible enemy and instead shooting each other, to a man imagining that occasional welfare checks by police are actually a conspiracy of harassment plotted against him.


  • That One Guy (profile), 13 Jun 2020 @ 12:52pm

    Still sound like a good idea?

    So for anyone pushing for laws that require internet companies to somehow ban or block "bad behavior" and "bad actors," you need to be able to come up with a definition of those things that won't be abused horribly by authoritarian governments around the globe.

    Or as I like to call it, the 'Turnabout is fair play' test.

    'Would you still support a position, argument, and/or power if your absolute worst enemy was able to make use of it, potentially against you, to its fullest extent the second it was available? If not, then you probably shouldn't be in favor of it.'


  • Pixelation, 13 Jun 2020 @ 5:40pm

    This was not simply content moderation. This was censorship.


