The Future of Work & AI in Society
Talk
Session Code: Sess-16
Day 1
11:30 – 12:00 EST
Digital platforms face an unprecedented challenge in managing billions of pieces of user-generated content, yet industry professionals frequently conflate Trust & Safety with content moderation — a fundamental misunderstanding that undermines the effectiveness of platform safety efforts. Drawing on extensive hands-on experience building integrity systems at Meta, Netflix, Coinbase, and Amazon, this presentation reveals the critical distinctions between tactical content moderation and strategic Trust & Safety governance.

Content moderation operates as a reactive, operational function focused on individual content review and immediate enforcement actions: teams work through queues of reported content, making rapid decisions based on established policy guidelines. In contrast, Trust & Safety functions as a holistic discipline encompassing proactive system design, cross-functional collaboration, and preventative architecture that addresses the underlying causes of harmful behavior before it manifests.

The presentation demonstrates how platforms that successfully integrate both approaches achieve significantly better harm-prevention outcomes while improving operational efficiency. Strategic Trust & Safety frameworks have enabled major platform transformations, including substantial revenue growth at companies like Coinbase and large-scale integrity improvements across Meta's global ecosystem serving billions of users.