Content Moderation At Scale Is Impossible; Naughty Kids In Wuhan Edition

from the masnick's-impossibility-theorem dept

I keep trying to point out that content moderation at scale is impossible to do well for a whole variety of reasons, including the fact that sooner or later some people -- or some large groups of people -- will try to game the system in totally unexpected ways. Witness this amusing example from the London Review of Books, reporting on the situation in Wuhan, China, ground zero for the Covid-19 coronavirus outbreak. With everything shut down in and around Wuhan, schools have moved to online learning -- and some naughty kids seem to have worked out a way to get out of doing schoolwork: getting the app the schools rely on pulled from the App Store via fake one-star reviews.

Schools are suspended until further notice. With many workplaces also shut, notoriously absent Chinese fathers have been forced to stay home and entertain their children. Video clips of life under quarantine are trending on TikTok. Children were presumably glad to be off school – until, that is, an app called DingTalk was introduced. Students are meant to sign in and join their class for online lessons; teachers use the app to set homework. Somehow the little brats worked out that if enough users gave the app a one-star review it would get booted off the App Store. Tens of thousands of reviews flooded in, and DingTalk’s rating plummeted overnight from 4.9 to 1.4. The app has had to beg for mercy on social media: ‘I’m only five years old myself, please don’t kill me.’

Must tip my cap to the cleverness here, but it shows, yet again, just how difficult content moderation really is. No one running an app store or other platform prepares for a situation like this. In this case, at least, with so many negative reviews -- and now press attention -- the platform will likely take notice and discount the recent flood of ratings. But imagine having to catch every case where this happens, often on a much smaller, less obvious scale.
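To see why a flood like this is so devastating to a simple star average -- and why "discount the suspect window" is one obvious (if crude) mitigation -- here's a minimal sketch. All numbers are hypothetical, chosen only to roughly reproduce the 4.9-to-1.x collapse described above; this is not DingTalk's or Apple's actual data or algorithm, and the `threshold` heuristic is an invented illustration.

```python
from statistics import mean

def rating(reviews):
    """Average star rating, rounded to one decimal place."""
    return round(mean(reviews), 1)

# Hypothetical baseline: a well-liked app averaging 4.9 stars.
baseline = [5] * 9000 + [4] * 1000

# Hypothetical review bomb: tens of thousands of one-star reviews.
bomb = [1] * 40000

print(rating(baseline))           # 4.9
print(rating(baseline + bomb))    # 1.8 -- the raw average collapses

# One crude mitigation: if a recent window of reviews is wildly out of
# line with the historical average, discount that window entirely.
def robust_rating(history, recent, threshold=2.0):
    if abs(mean(recent) - mean(history)) > threshold:
        return rating(history)    # anomaly detected: ignore the flood
    return rating(history + recent)

print(robust_rating(baseline, bomb))  # 4.9
```

The catch, as the post notes, is that this only works when the anomaly is this blatant; a smaller, slower bombing campaign would slip under any fixed threshold.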

What seems easy about content moderation almost never is. Everyone seems to think it's easy until they're actually running a platform.

Filed Under: content moderation, content moderation at scale, coronavirus, covid-19, dingtalk, ios, remote learning, students, wuhan
Companies: apple


Reader Comments



Anonymous Coward, 10 Mar 2020 @ 10:57am

This is why I still think an up-or-down choice works better than a star rating. Giving people a way to amplify their own like (or dislike) of something skews the results.

