Content Moderation At Scale Is Impossible; Naughty Kids In Wuhan Edition

from the masnick's-impossibility-theorem dept

I keep trying to point out that content moderation at scale is impossible to do well for a whole variety of reasons, including the fact that sooner or later some people -- or some large groups of people -- may try to game the system in totally unexpected ways. Witness this amusing example from the London Review of Books, reporting on the situation in Wuhan, China, which was ground zero for the Covid-19 coronavirus outbreak. With everything shut down in and around Wuhan, schools have moved to online learning -- and some naughty kids seem to have worked out a way to try to get out of having to do schoolwork: getting the app the schools rely on pulled from the app store via fake negative ratings.

Schools are suspended until further notice. With many workplaces also shut, notoriously absent Chinese fathers have been forced to stay home and entertain their children. Video clips of life under quarantine are trending on TikTok. Children were presumably glad to be off school – until, that is, an app called DingTalk was introduced. Students are meant to sign in and join their class for online lessons; teachers use the app to set homework. Somehow the little brats worked out that if enough users gave the app a one-star review it would get booted off the App Store. Tens of thousands of reviews flooded in, and DingTalk’s rating plummeted overnight from 4.9 to 1.4. The app has had to beg for mercy on social media: ‘I’m only five years old myself, please don’t kill me.’

I must tip my cap to the kids' cleverness here, but it shows, yet again, just how difficult content moderation is to handle at scale. No one running an app store or other platform prepares for a situation like this. In this case, at least, with so many negative reviews -- and now press attention -- the platform will likely take notice and discount the most recent flood of reviews. But imagine having to keep track of every case where this happens, often on a much smaller, less obvious scale.
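A crude version of that "discount the spike" idea can be sketched as a simple rate check: compare the share of one-star reviews in the last day against the app's historical baseline. Everything below -- the function name, the thresholds, the data shape -- is a hypothetical illustration, not how Apple (or anyone else) actually scores reviews:

```python
from datetime import datetime, timedelta

def is_review_bomb(reviews, now, window_hours=24, rate_multiplier=10, min_recent=100):
    """Hypothetical review-bomb check.

    reviews: list of (timestamp, star_rating) tuples.
    Returns True when the one-star rate inside the recent window is
    anomalously high compared with the app's historical one-star rate.
    """
    window_start = now - timedelta(hours=window_hours)
    recent = [r for r in reviews if r[0] >= window_start]
    older = [r for r in reviews if r[0] < window_start]
    # Too little recent data, or no baseline at all: can't call it a spike.
    if len(recent) < min_recent or not older:
        return False
    recent_one_star = sum(1 for _, stars in recent if stars == 1) / len(recent)
    baseline_one_star = sum(1 for _, stars in older if stars == 1) / len(older)
    # Guard against a near-zero baseline: a 4.9-rated app like DingTalk
    # may have almost no historical one-star reviews.
    baseline = max(baseline_one_star, 0.01)
    return recent_one_star >= rate_multiplier * baseline
```

Even this toy version hints at the real problem: the thresholds that catch a DingTalk-sized flood would sail right past a smaller, slower brigade, and tightening them starts flagging apps that simply got worse.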

What seems easy about content moderation almost never is. Everyone seems to think it's easy until they're actually running a platform.

Filed Under: content moderation, content moderation at scale, coronavirus, covid-19, dingtalk, ios, remote learning, students, wuhan
Companies: apple

Reader Comments


    urza9814 (profile), 10 Mar 2020 @ 12:25pm

    Re: Re: The problem isn't the moderation

    If you don't trust the manufacturer of the app, why are you installing it? In most cases I trust them more than I trust Google.

    Google catches the obvious malware, but they're also the delivery system for the less obvious malware. Moderating for viruses is no easier than moderating for content (it's probably harder, since bad content usually isn't trying to hide what it is), and malware has gotten through in the past and will again. Better to download from a reputable source in the first place than to grab any random garbage that pops up in a search result and assume it's safe.
