Cathryn Weems on Content Moderation and Leveraging Transparency Reporting to Build Trust

Duration: 44:15
 
How do tech platforms develop clear content policies while balancing user freedom, regulatory requirements, and cultural contexts? What does it take to scale trust and safety efforts for billions of users in a rapidly changing digital landscape? Navigating these challenges requires foresight, transparency, and a deep understanding of user behavior.

In today’s episode of Click to Trust, we are joined by Cathryn Weems, Head of Content Policy at Character.AI, to explore the intricacies of building Trust and Safety policies. Cathryn shares her extensive experience shaping content policies at some of the world’s largest tech platforms, from crafting transparency reports to addressing complex government takedown requests. She offers unique insights into balancing global scalability with localized approaches, and explains why clear, well-crafted transparency reports are key to fostering trust.

In this episode, you’ll learn:

  1. The Art of Content Policy: Cathryn explains the challenges of defining “gray area” content and how tech platforms can develop policies that are clear and enforceable at scale.
  2. Transparency in Action: Gain insights into the evolution of transparency reporting and how transparency reports build trust with users while navigating government regulations.
  3. Women in Tech Leadership: Cathryn shares advice for aspiring women leaders in Trust and Safety, including strategies for negotiating compensation and carving a path for yourself in a male-dominated field.

Jump into the conversation:
(00:00) Meet Cathryn Weems
(01:10) The evolution of Trust & Safety as a career path
(05:30) Tackling the complexities of content moderation at scale
(10:15) Crafting content policies for gray areas and new challenges
(14:40) Transparency reporting: Building trust through accountability
(20:05) Addressing government takedown requests and censorship concerns
(25:25) Balancing cultural context and global scalability in policy enforcement
(30:10) The impact of AI on content moderation and policy enforcement
(35:45) Cathryn’s journey as a female leader in Trust & Safety
(40:30) Fostering trust and improving safety on digital platforms
