Trust & Safety Professional Association Launches: This Is Important
from the exciting-news dept
One of the most frustrating things out there is the idea that content moderation choices made on various platforms are coming directly from the top. Too often, I’ve seen people blame Jack Dorsey or Mark Zuckerberg for content moderation decisions, as if they’re sitting there at their laptops and twiddling their fingers over who gets blocked and who doesn’t. Over the last decade or so, an entire industry has been built up to figure out how to make internet services as usable as possible, to deal with spam, and abuse, and more. That industry is generally called “trust and safety,” and as a new industry it has grown up and professionalized quite a bit in the last decade — though it rarely (if ever) gets the respect it deserves. As I mentioned on a recent episode of The Pivot podcast, many of the assumptions that people make about content moderation unfairly malign the large crew of people working in trust and safety who aren’t interested in political bias, or silencing voices, but who legitimately are working very, very hard to figure out how to balance the many, many tradeoffs in trying to make internet services useful and welcoming to users.
Today, we're pleased to announce the Trust & Safety Professional Association (TSPA) and the Trust & Safety Foundation Project (TSF).* TSPA is a new, nonprofit, membership-based organization that will support the global community of professionals who develop and enforce principles and policies that define acceptable behavior online. TSF will focus on improving society's understanding of trust and safety, including the operational practices used in content moderation, through educational programs and multidisciplinary research. Neither TSPA nor TSF is a lobbying organization, and neither will advocate for public policy positions on behalf of corporate supporters or anyone else. Instead, we will support the community of people doing the work, and society's understanding of it.
And I should note that the people behind this organization are incredible. If you told me about such an organization and asked me to suggest who should be involved, I would have included exactly the people who put this together, starting with Adelin Cai and Clara Tsao, who both have tremendous experience in the trust and safety space, and the knowledge and thoughtful, balanced approach necessary to build organizations like the two launched today. If you ever need someone to talk through all the challenges involved in building a successful trust and safety team, I'd highly recommend both Adelin and Clara. The board also includes some names you may recognize, including Professor Eric Goldman, former Twitter/Google lawyer and White House deputy CTO Alex Macgillivray, and former Mozilla Chief Legal Officer/COO and current Stellar Development Foundation CEO Denelle Dixon.
And… one of the initial projects that the Trust & Safety Foundation has launched is an ongoing series of trust and safety case studies written by… us. Techdirt's think tank arm, the Copia Institute, will be providing a series of trust and safety case studies to the Trust & Safety Foundation, which they'll be posting each week. We'll eventually be posting many of them to Techdirt as well, so you can expect those coming later this summer. The point of this library of case studies is to give people a better understanding of the impossible choices and tradeoffs that internet services need to make on a daily basis, and to highlight why what often seems like an "obvious" way to deal with some piece of content may not be so obvious once you explore it from all sides. Personally, I'm excited to help build out this library and to work with such a great team of people who are devoted to improving and professionalizing the space, while further educating everyone (both inside and outside the trust and safety space) about how trust and safety efforts actually work.