adelin.cai's Techdirt Profile


Posted on Techdirt - 28 August 2020 @ 12:00pm

The Trust & Safety Professional Association: Advancing The Trust And Safety Profession Through A Shared Community Of Practice

For decades, trust and safety professionals working in content moderation, fraud and risk, and safety have faced enormous challenges, often under intense scrutiny. In recent years, it has become even clearer that the role of trust and safety professionals is both critically important and difficult. In 2020 alone, we’ve seen an increasing need for this growing class of professionals to combat a myriad of online abuses related to systemic racism, police violence, and COVID-19 — such as hate speech, misinformation, price gouging, and phishing — while maintaining a safe space for connecting people with vital, authoritative information, and with each other.

Despite the enormous impact trust and safety professionals have on people’s online and offline safety, the professional community has historically been dispersed, siloed, and informally organized. To date — unlike, say, in privacy — no organization has focused on the needs of trust and safety professionals in a way that builds a shared community of practice.

This is why we founded the Trust & Safety Professional Association (TSPA) and the Trust & Safety Foundation Project (TSF) — something we think is long overdue. TSPA is a new, nonprofit, membership-based organization that will support the global community of professionals who develop and enforce principles and policies that define acceptable behavior online. TSF will focus on improving society’s understanding of trust and safety, including the operational practices used in content moderation, through educational programs and multidisciplinary research.

Since we launched in June, we’ve gotten a number of questions about what TSPA and TSF will (and won’t) do. So we thought we’d tackle them right here, and share more with you about who’s included, why we launched now, and what our vision is for the future. You can also hear us talk more about both organizations on episode 247 of the Techdirt podcast. And if you want to know even more, we’re all ears!

Q&A

Q. How do you define trust and safety? Don’t you mean content moderation?

We define trust and safety professionals as the global community of people who develop and enforce policies that define acceptable behavior online.

Content moderation is a big part of trust and safety, and the area that gets the most public attention these days. But trust and safety also includes the people who tackle financial risk and fraud, those who process law enforcement requests, engineers who work on automating these policies, and more. TSPA is for the professionals who work in all of those areas.

Q. What’s the difference between TSPA and TSF?

TSPA is a 501(c)(6) membership-based organization for professionals who develop and enforce principles and policies that define acceptable behavior and content online. Think ABA for lawyers, or IAPP for privacy people, but for those working in trust and safety, who can use TSPA to connect with a network of peers, find resources for career development, and exchange best practices.

TSF is a fiscally sponsored project of the Internet Education Foundation and focuses on research.

The two organizations are complementary, but have distinct missions and serve different communities. TSPA is a membership organization, while TSF has a charitable purpose.

Q. Why are you doing this now?

We first started discussing the need for something like this more than two years ago, in the wake of the first Content Moderation at Scale (COMO) conference in Santa Clara. The conference was convened by one of TSPA’s founders and board members, Santa Clara University law professor Eric Goldman, and you can read about it right here. After the first COMO get-together, it was clear that there was a need for more community among people who do trust and safety work.

Q. Are you taking positions on policy issues or lobbying?

Nope. We’re not advocating for public policy positions on behalf of corporate supporters or anyone else. We do want to help people better understand trust and safety as a field, as well as shed light on the challenges that trust and safety professionals face.

Q. Ok, so you launched. Now what?

For TSPA, we’re in the process of planning some virtual panel discussions that will happen before the end of the year on various topics related to trust and safety. Topics will range from developing wellness and resilience best practices to navigating operational challenges in the face of current events like the US presidential election and COVID-19. Longer term, we’re working on professional development offerings, like career advancement bootcamps and a job board.

Over at TSF, we partnered with the folks right here at Techdirt to launch a series of case studies from the Copia Institute that illustrate the challenging choices trust and safety professionals face. We are also hosting an ongoing podcast series called Flagged for Review, featuring interviews with people who have expertise in trust and safety.

We’re also looking for a founding Executive Director who can get TSPA and TSF off the ground. Send good candidates our way.

Q. Sounds pretty good. How do I get involved?

Sign up here so we can share more with you about TSPA and TSF in the coming months as we open our membership and develop our offerings. Follow us on Twitter, too. If you work for one of our corporate supporters, you can reach out to your trust and safety leadership as well to find out more. We’d also love to hear from organizations and people who want to help out, or whose work is complementary to our own. We’re excited to further develop and support the community of online trust and safety professionals.

Posted on Techdirt - 5 February 2018 @ 01:25pm

Putting Pinners First: How Pinterest Is Building Partnerships For Compassionate Content Moderation

Last week, Santa Clara University hosted a gathering of tech platform companies to discuss how they actually handle content moderation questions. Many of the participants in the event have written essays about the questions that were discussed at the event. Last week we published five of those essays and this week we’re continuing to publish more of them, including this one.

The way platforms develop content moderation rules can seem mysterious or arbitrary. At first glance, the result of this seemingly inscrutable process is varying guidelines across different platforms, with only a vague hint of an industry standard — what might be banned on one platform seems to be allowed on another. While each platform may have nuances in the way it creates meaningful content moderation rules, these teams generally seek to align with the platform’s or company’s purpose, and use policies and guidelines to support an overarching mission. The fact that different platforms deliver unique value propositions to their users accounts for much of the variation in content moderation approaches.

At Pinterest, our purpose is clear: we help people discover and do what they love by showing them ideas that are relevant, interesting, and personal. For people to feel confident and encouraged to explore new possibilities, or try new things on Pinterest, it’s important that the Pinterest platform continues to prioritize an environment of safety and security. To accomplish that, a team of content policy professionals, skilled in collaborating across different technical and non-technical functions at the company, decides where we draw the line on acceptable content and behavior. Drawing upon the feedback of Pinterest users, and staying up to date on prevailing discourse about online content moderation, this team of dedicated content generalists brings diverse perspectives to bear on the guidelines and processes that keep divisive, disturbing, or unsafe content off Pinterest.

We know how impactful Pinterest can be in helping people make decisions in their daily lives, like what to eat or what to wear, because we hear directly from the Pinterest community. We’ve also heard how people use Pinterest to find resources to process illness or trauma they may have experienced. Sometimes, the content that people share during these difficult moments can be polarizing or triggering to others, and we have to strike the right balance between letting people rely on Pinterest as a tool for navigating these difficult issues and living up to our goal of removing divisive, disturbing, or unsafe content. As a team, we have to consider the broad range of use cases for content on Pinterest. For example, important historical yet graphic images of war can be collected in the context of learning about world events, or to glorify violence. Our team takes different contextual signals into account during the review process in order to make meaningful content moderation choices that ensure a positive experience for our community. If we wish to have the impact we hope to have in people’s lives, we must also take responsibility for their entire experience.

Being responsible for the online environment that our community experiences, and aware of how that experience connects in a concrete way to their lives offline, means cultivating the humility to recognize our team’s limitations. We can’t claim to be experts in fields like grief counseling, eating disorder treatment, or suicide prevention — areas that many groups and individuals have dedicated their careers to supporting — so it’s crucial that we partner with experts for the guidance, specialized skills, and knowledge that will enable us to better serve our community with respect, sensitivity, and compassion.

A couple years ago, we began reexamining our approach to one particularly difficult issue – eating disorders – to understand the way our image-heavy platform might contribute to perpetuating unhealthy stereotypes about the ideal body. We had already developed strict rules about content promoting self-harm, but wanted to ensure we were being thoughtful about content offering “thinspiration” or unhealthy diets from all over the internet. To help us navigate this complicated issue, we sought out the expertise of the National Eating Disorder Association (NEDA) to audit our approach, and understand all of the ways we might engage with people using the platform in this way.

Prior to reaching out to NEDA, we put together a list of search queries and descriptive keyword terms that we believed strongly signaled a worrying interest in self-harm behaviors. We limit the search results we show when people seek out content using these queries, and we also use these terms as a guide for Pinterest’s operational teams to decide if any given piece of self-harm-related content should be removed or hidden from public areas of the service. The subject matter experts at NEDA generously agreed to review our list to see if our bar for problematic terms was consistent with their expert knowledge, and they provided us with the feedback we needed to ensure we were aligned. We were relieved to hear that our list was fairly comprehensive, and that our struggle with grey-area queries and terms was not unique. Since that partnership began, NEDA has developed a rich Pinterest profile to inspire people by sharing stories of recovery, content about body positivity, and tips for self-care and illness management. By maintaining a dialogue with NEDA, the Pinterest team has continued to consider and operationalize innovative features to facilitate possible early intervention on the platform. For example, we provide people searching for eating disorder content with an advisory that links to specialized resources on NEDA’s website, and we supported their campaign for National Eating Disorder Awareness Week. Through another partnership and technical integration with Koko, a third-party service that provides platforms with automated and peer-to-peer chat support for people in crisis, we’re also able to provide people who may be engaging in self-harm behaviors with direct, in-the-moment crisis prevention.
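To make the mechanics a bit more concrete, here is a minimal, hypothetical sketch of how a query-screening step like the one described above might work: a search query is checked against a curated list of sensitive terms, and flagged queries get limited results plus an advisory pointing to specialized resources. The term list, advisory text, and function names here are illustrative assumptions for this post, not Pinterest's actual implementation.

```python
# Hypothetical sketch only: the terms, advisory, and names are illustrative,
# not Pinterest's real system.

# Example terms that might signal a worrying interest in self-harm behaviors.
SENSITIVE_TERMS = {"thinspiration", "thinspo"}

# Advisory shown alongside limited results, pointing to specialized resources.
ADVISORY = {
    "message": "If you or someone you know is struggling, help is available.",
    "resource_url": "https://www.nationaleatingdisorders.org/",
}

def screen_query(query: str) -> dict:
    """Decide whether to limit results and show an advisory for a search query."""
    tokens = set(query.lower().split())
    flagged = bool(tokens & SENSITIVE_TERMS)
    return {
        "limit_results": flagged,   # restrict which search results are shown
        "show_advisory": flagged,   # surface a support advisory with resources
        "advisory": ADVISORY if flagged else None,
    }

if __name__ == "__main__":
    print(screen_query("thinspiration ideas"))
    # {'limit_results': True, 'show_advisory': True, 'advisory': {...}}
```

In practice, a system like this would only be a first pass: as the essay describes, contextual signals and human review still determine whether any individual piece of content is removed or hidden.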

Maintaining a safe and secure environment in which people can feel confident to try new things requires a multifaceted approach and multifaceted perspectives. Our team is well-equipped to grapple with broad online safety and content moderation issues, but we have to recognize when we might lack in-house expertise in more complex areas that require additional knowledge and sensitivity. We have much more work to do, but these types of partnerships help us adapt and grow as we continue to support people using Pinterest to discover and do the things they love.

Adelin Cai runs the Policy Team at Pinterest
