Content Moderation Case Study: Profanity Filter Causes Problems At Paleontology Conference (October 2020)
from the f-that dept
Summary: With the COVID pandemic still in full force, the Society of Vertebrate Paleontology moved its annual meeting online. The event was due to run for an entire week, but early issues caused attendees and moderators to question the contents of the pre-packaged content filter provided by Convey Services, which operated the virtual meeting software.
The effort to ensure follow-up Q&A sessions would be free of profanity and other disruptions went awry when terms commonly used by paleontologists got caught in the software’s filter. While words like “pubic,” “bone,” or “hell” might be appropriately blocked elsewhere, blocklisting them disrupted the very conference the software was supposed to keep from being disrupted.
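The behavior described above is characteristic of a generic keyword filter: one flat blocklist applied to every message, with no awareness of the audience or subject matter. A minimal sketch of that approach is below; the term list and function names are illustrative assumptions, not Convey’s actual code.

```python
# Hypothetical one-size-fits-all keyword filter, as described in the case study.
# A single generic blocklist is applied to every chat message.
GENERIC_BLOCKLIST = {"bone", "pubic", "hell", "stream", "wang"}

def is_blocked(message: str) -> bool:
    """Return True if any word in the message matches the generic blocklist."""
    words = message.lower().split()
    return any(word.strip(".,?!") in GENERIC_BLOCKLIST for word in words)

# Ordinary paleontology Q&A trips the filter:
print(is_blocked("We found the pubic bone in a dry stream bed"))  # True
print(is_blocked("What was the sediment like at the site?"))      # False
```

Because the list is context-free, perfectly ordinary disciplinary vocabulary is indistinguishable from abuse, which is exactly the failure mode the attendees reported.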
“I would hope that actual swears or slurs would be censored, since paleontology is not a field that’s immune to racist/sexist jerks,” noted Brigid Christison, a master’s student in biology at Carleton University, in an email.
“Words like ‘bone,’ ‘pubic,’ and ‘stream’ are frankly ridiculous to ban in a field where we regularly find pubic bones in streams,” Christison said.
More problematically, attendees noted the software blocked “Wang” but left “Johnson” intact, despite both being sexual slang. This appeared to indicate some bias on the part of Convey’s blocklist creators — a bias that effectively erased a surname belonging to more than 90 million Chinese citizens.
Decisions to be made by Convey:
- Are generic ban lists effective enough in most situations to justify forgoing content-specific customization for more specialized uses?
- Given the global reach of the internet, is it still acceptable to block common surnames that can also be deployed as sexual slang?
- Does providing separate keyword sets for different uses reduce the profitability of software sales and licenses?
Questions and policy implications to consider:
- Do word ban lists raise many issues in general use, or are problems simply noticed more often when the software is deployed by entities dealing with specialized subject matter?
- Does allowing licensees to alter keywords as needed eliminate some of the problems caused by non-specific ban lists?
- Is shifting the cost of moderation to customers a solid business decision?
Resolution: The Society was able to alter the ban list to better fit the subject matter once the problem was noticed. It continued to edit the keywords provided by Convey, reducing the probability of overblocking as the week rolled on. Because the alterations could be made on the client side, disruption was minimal, if inadvertently comical.
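The client-side fix the Society applied amounts to subtracting domain-appropriate terms from the vendor’s generic list. A minimal sketch of that idea follows; the lists, names, and matching logic are assumptions for illustration, not Convey’s actual API.

```python
# Illustrative sketch of a licensee-editable blocklist: start from the vendor's
# generic list, then remove terms that are ordinary vocabulary for this event.
vendor_blocklist = {"bone", "pubic", "hell", "stream", "wang"}
domain_safe_terms = {"bone", "pubic", "stream", "wang"}  # normal usage in paleontology

# The conference runs with the trimmed list.
conference_blocklist = vendor_blocklist - domain_safe_terms

def is_blocked(message: str, blocklist: set) -> bool:
    """Return True if any word in the message matches the given blocklist."""
    return any(word.strip(".,?!") in blocklist for word in message.lower().split())

print(is_blocked("The pubic bone was found in a stream", conference_blocklist))  # False
```

Because the customization happens on the client side, each licensee can keep trimming (or extending) the list as new overblocking or underblocking turns up, without waiting on the vendor.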
Originally posted to the Trust & Safety Foundation website.