- Tech Industry Collaboration: Google, OpenAI, Roblox, and Discord launch ROOST, a non-profit focused on child safety through AI-driven content moderation tools.
- AI-Powered Protection: The initiative aims to unify and enhance existing CSAM detection technologies, making them more accessible for companies to implement.
- Regulatory Pressure & Funding: With child exploitation cases rising, ROOST has secured $27 million in funding to support its first four years, backed by major philanthropic organizations.
In a major move to enhance child safety online, Google, OpenAI, Roblox, and Discord have launched a non-profit organization called Robust Open Online Safety Tools (ROOST). This initiative aims to provide free, open-source AI tools that help identify, review, and report child sexual abuse material (CSAM). By making core safety technologies more accessible, ROOST seeks to assist companies in improving their content moderation efforts and protecting young users.
The formation of ROOST comes as advancements in generative AI continue to reshape digital environments, raising new concerns about online child safety. The initiative plans to unify existing detection technologies while leveraging AI-driven solutions to improve content moderation. While details on specific tools remain limited, ROOST emphasizes its commitment to fostering innovation by making safety infrastructure more transparent and inclusive.
The launch of ROOST coincides with growing regulatory scrutiny over child safety measures on social media and gaming platforms. Companies are under increasing pressure to self-regulate as lawmakers push for stricter protections. Reports from the National Center for Missing and Exploited Children indicate a 12% rise in suspected child exploitation cases between 2022 and 2023, underscoring the urgency of the issue. Roblox, a platform widely used by children, has faced particular criticism for failing to prevent child exploitation.
ROOST’s founding members are contributing funding, expertise, and existing safety technologies to the project. The initiative aims to establish a “community of practice” by collaborating with AI developers, providing vetted training datasets, and addressing gaps in current safety measures. Plans include making existing tools more accessible and integrating AI moderation systems through API-based solutions for broader adoption.
Separately, Discord has introduced a new “Ignore” feature that lets users mute messages discreetly. ROOST itself has already secured over $27 million in funding from philanthropic groups, including the McGovern Foundation and the Knight Foundation. With support from experts in AI, child safety, and digital security, the organization aims to set a new standard for online protection and a safer internet for young users.