
In July 2024, Tech Policy Press published “Evaluating the Forces Shaping the Trust & Safety Industry,” a comprehensive analysis of the major actors and influencers in Trust & Safety by Sujata Mukherjee and Jan Eissfeldt.
Both authors are highly respected leaders in the Trust and Safety field. Drawing on Mukherjee’s work at Google and Eissfeldt’s leadership at the Wikimedia Foundation, the authors emphasize community engagement and global strategy.
Their contributions have significantly shaped the practices and policies that govern online Trust and Safety today. I would urge you to read the full article, but here’s a synopsis of the main elements covered.
Evolution of Trust & Safety
The Trust & Safety field has grown significantly, influenced by three key factors:
- Proliferation of Online Services – With the expansion of user-generated content (UGC) and online platforms, the need for robust Trust and Safety measures has increased.
- Recognition of Misuse and Abuse – Companies have realized that their products can be exploited, leading to negative user experiences and damage to their reputations.
- Ongoing Adaptation – The techniques used by bad actors are constantly evolving, requiring continuous updates to Trust and Safety practices.
Current Challenges for Trust and Safety Enterprises
To understand the challenges for Trust and Safety enterprises, one must look at several areas, including:
- Scale and Complexity – Managing vast amounts of content across multiple platforms.
- Emerging Threats – New forms of online harm, such as deepfakes and sophisticated scams, are constantly appearing, even as teams continue moderating large volumes of UGC to combat hate speech, misinformation campaigns, and serious risks to child safety.
- Policy Development – Creating and enforcing policies that keep pace with changing technologies and behaviors.
- Meeting Regulatory Demands – Achieving compliance with new legislation like the EU’s Digital Services Act (DSA) and the UK’s Online Safety Act is a major hurdle. This often requires publishing detailed transparency reports on content moderation activities.
Importance of a Holistic Approach
The article emphasizes the importance of a comprehensive approach that includes policy development, technological solutions, and human oversight. This approach is built on safety by design, embedding safety features and proactive risk management into products from their initial development.
It requires continuous threat intelligence and regular risk assessments to anticipate potential misuse. Collaboration between companies, governments, and civil society is essential to address the complex nature of Trust and Safety.
Call to Action
The Trust & Safety industry must continue to evolve and adapt to new challenges. Here are key actions that stakeholders should consider:
- Invest in Technology and AI Governance – Enhance automated systems while managing the dual nature of generative AI, which both enables new abuse vectors and strengthens detection and moderation capabilities.
- Develop Comprehensive Policies – Create clear, flexible policies that can adapt to new forms of abuse and misuse.
- Foster Collaboration – Work with other companies, governments, and organizations to share knowledge and strategies.
- Build Strong and Sustainable Operations – Invest in building skilled Trust and Safety teams with clear escalation management protocols.
- Prioritize Mental Health – Ensure that content moderators and Trust and Safety professionals have access to mental health resources to cope with the demands of their work.
- Promote Transparency – Maintain transparency with users about Trust & Safety measures and how decisions are made.
The Trust & Safety industry is at a critical juncture. At Zevo Health, we believe these actions will help better protect users and create safer online environments.