Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs involved in making them. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Ask.fm Responds After A Teen's Suicide Is Linked To Bullying On The Site (August 2013)

from the difficult-content-moderation-questions dept

Summary: After a UK teen took her own life in response to bullying on the social networking site ask.fm, her father asked both the site and the UK government to take corrective measures to prevent further tragedies. This wasn't an isolated incident: reports linked multiple suicides to bullying on the teen-centered site.

Ask.fm's problems with bullying and other abuse appeared to be far greater than those observed on other social media sites. Part of this appeared to be due to the site's user base, which was much younger than more-established social media platforms. This -- combined with the option to create anonymous accounts -- seemed to have made ask.fm a destination for abusive users. What moderation existed before these problems became headline news was apparently ineffective, resulting in a steady stream of horrific stories until the site began to make serious efforts to curb a problem now too big to ignore.

Ask.fm's immediate response to both the teen's father and UK Prime Minister David Cameron's criticism (Cameron called for a boycott of the site) was to point to existing moderation efforts put in place to deter bullying and other terms of service violations.

After major companies pulled their advertising, ask.fm pledged to assist police in investigating the circumstances behind the teen's suicide, as well as consult with a law firm to see if moderation efforts could be improved. It also hired more moderators and a safety officer, and made its "Report" button more prominent.

More than a year after ask.fm became the target of criticism around the world, the site implemented its first Safety Advisory Board. The group of experts on teens and their internet use was tasked with reducing the amount of bullying on the platform and making it safer for its young users.

More significantly, ask.fm's founders -- who were viewed as unresponsive to criticism -- were removed by the site's new owners, InterActiveCorp (IAC). IAC pledged to work more closely with US law enforcement and safety experts to improve moderation efforts.

Decisions to be made by ask.fm:

  • Should anonymous accounts be eliminated (or stop-gapped by gathering IP address/personal info) to limit abusive behavior?
  • Does catering to a younger user base create unique problems not found at sites that skew older?
  • Would more transparency about moderation efforts/features nudge more users towards reporting abuse?
  • Should the site directly intervene when moderators notice unhealthy/unwanted user interactions?

Questions and policy implications to consider:

  • Given the international reaction to the teen's suicide, does a minimal immediate response make the perceived problem worse?
  • Does having a teen user base increase the risk of direct regulation or unfavorable legislation, given the increased privacy protections for minors in many countries?
  • Are moderation efforts resulting from user reports vetted periodically to ensure the company isn't making bullying/trolling problems worse by allowing abusive users to get others suspended or banned?

Resolution: When immediate steps did little to deter criticism, ask.fm formed its Safety Advisory Board and, ultimately, dismissed the founders who appeared to be unresponsive to users' concerns. The site made changes to its moderation strategies, hired more moderators, and made users more aware of the features they could use to report users and avoid unwanted interactions.

Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: bullying, case study, content moderation, suicide
Companies: ask.fm


Reader Comments

  • Ann Brush (profile), 30 Sep 2020 @ 4:43pm

    Has the bullying situation improved?

    Following the suicide, some assessment of bullying and reporting thereof would be very illuminating. I see what they have done and some policy and position considerations, but what’s missing are metrics regarding whether the bullying situation improved in the time since the suicide during which all these efforts were made. Is what they are doing making any difference?


  • Anonymous Coward, 30 Sep 2020 @ 5:31pm

    UK Prime Minister David Cameron's criticism involved asking parents to make sure their kids weren't being dicks online.

    Oh wait, no, that isn't what happened at all.


    • crade (profile), 1 Oct 2020 @ 7:44am

      Re:

      No, obviously if their kids aren't dicks on that specific platform then the problem is solved.. Not like they could just move their bad behaviour elsewhere and do the same thing at school, sports events, parks, etc etc


  • Anonymous Coward, 1 Oct 2020 @ 11:52am

    How about as parents, we monitor our kids' social media accounts and internet browsing. I know, shocking. We can just blame the companies that run said social media/websites instead. Shocking.


