Content Moderation Case Studies: Coca Cola Realizes Custom Bottle Labels Involve Moderation Issues (2021)
from the content-moderation-is-everywhere dept
Summary: Content moderation questions can come from unexpected places, including custom soda bottle labels. Over the years, Coca Cola has experimented with a variety of promotional efforts involving customized cans and bottles, not without controversy. Back in 2013, as part of its “Share a Coke” campaign, the company offered bottles with common first names on the labels, which angered some who felt left out. In Israel, for example, people noticed that Arabic names were left off the list, although Coca Cola’s Israeli operation said that this decision was made after the local Muslim community asked not to have their names included.
This controversy was only the preamble to a bigger one in the summer of 2021, when Coca Cola began its latest version of the “Share a Coke” effort — this time allowing anyone to create a completely custom label up to 36 characters long. Opening up custom labels immediately raised content moderation questions.
Some people quickly noticed that certain terms and phrases were blocked (such as “Black Lives Matter”), while others, surprisingly, were not (like “Nazis”).
As CNN reporter Alexis Benveniste noted, it was easy to get offensive terms through the blocks (often with a few tweaks), and there were some eye-opening contrasts:
For example, “Black Lives Matter,” is blocked. But “White Lives Matter” isn’t. Coke included a special rainbow label for pride month, but you can’t write “Gay Pride” on the bottle. However, you can write “I hate gays.” “Hitler” and “Nazi” are banned, but users can customize bottles with the phrases, “I am Hitler” or “I am a Nazi.” — Alexis Benveniste
The fact that “I am Hitler” was allowed while “Hitler” by itself was not suggests that Coca Cola’s filter checked the submitted text against a blocklist of exact phrases, rather than scanning it for banned words. Exact-phrase matching lets simple adjustments slip through, which might also explain why “Nazi” was blocked but “Nazis” apparently was not.
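The difference between these two filtering strategies can be sketched in a few lines of Python. This is a hypothetical illustration; the blocklist contents and function names are assumptions, not Coca Cola’s actual system:

```python
# Hypothetical sketch of two label-filtering strategies (not Coca Cola's code).

BLOCKLIST = {"hitler", "nazi", "black lives matter"}

def exact_match_filter(label: str) -> bool:
    """Block only if the entire label equals a blocklisted phrase."""
    return label.strip().lower() in BLOCKLIST

def substring_filter(label: str) -> bool:
    """Block if any blocklisted phrase appears anywhere in the label."""
    text = label.lower()
    return any(phrase in text for phrase in BLOCKLIST)

# Exact matching lets trivial variants through:
exact_match_filter("Hitler")       # True  (blocked)
exact_match_filter("I am Hitler")  # False (allowed)
exact_match_filter("Nazis")        # False (allowed: the plural is not in the list)

# Substring matching catches those variants, at the cost of false
# positives on innocent text containing a blocked string:
substring_filter("I am Hitler")    # True (blocked)
substring_filter("Nazis")          # True (blocked: "nazi" is a substring)
```

The observed behavior (whole phrases blocked, trivial variants allowed) is consistent with the first approach; the second catches more abuse but famously over-blocks, which is one reason a human review step before production matters either way.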
Coca Cola insisted that the automated blocks in its web tool were not the only layer of review, just the first filter before labels were passed on to production, and that “actual bottles are not made with words that are inconsistent with the program’s intent.”
Questions to consider:

- What kinds of tools, systems, staff, and processes should be put in place to deal with potential “abuse” of a custom labels program?
- How should the “intent” of the program be communicated to consumers who want their own bottles, but may ask for problematic content on the labels?
- How could the website more clearly inform consumers that the final text will still be reviewed by staff before production, so as not to let the public assume that if a word or phrase was not rejected in the web form, it will be printed?
- Customization systems are often put in place because they are considered fun and engaging, and a way for consumers to connect with a brand. How should companies weigh such benefits against the likelihood of abuse?
- How should companies using these types of customization options weigh the potential for consumer backlash over both perceived over-moderation and perceived under-moderation?
- As it becomes easier to mass produce customized products, how should companies set up campaigns to minimize possible abuses, while balancing the backlash if they disallow words or phrases that are important to particular groups?
Resolution: Coca Cola admitted to CNN that the process is constantly being adjusted. “We’re continuously refining and improving our Share A Coke personalization tool to ensure it is used only for its intended purpose.” The company also noted that it added language to the preview screen to say that “proposed language may require further review.” The company did not explain why terms like “Black Lives Matter” were not approved.
Originally posted to the Trust & Safety Foundation website.