Safety Snapshot: January
Proactive Avatar Moderation and Civility Research
This month’s Safety Snapshot highlights the safety and civility initiatives we’ve launched over the last few weeks, including new technology that proactively moderates avatars and educates users about our policies. We’ve also partnered with third-party organizations on new civility research and reporting.
Announcing Proactive Avatar Moderation
In 2025, Roblox users updated their avatars an average of 274 million times a day, changing clothes and hairstyles and adding pets, hats, and other accessories. Many users spend time carefully curating their avatar’s look because it’s an expression of themselves on the platform. To help keep people safe, we proactively scan everything uploaded to the Avatar Marketplace and block anything we detect as violating our Community Standards. However, users sometimes combine innocuous items into an outfit that violates our policies. For example, a black suit, an armband, and a mustache are each compliant on their own, but combined they could be used to create an avatar that resembles Adolf Hitler, which goes against our policies.
We’ve historically identified problematic avatars through user reports. Beginning this month, we will automatically scan avatars for violations every time they’re changed, often before other users see them. These scans evaluate the complete avatar, with all equipped items, to detect whether anything violates our standards. If the system detects anything inappropriate, the user’s avatar will be reset to a default avatar (with all avatar items unequipped). A notification will inform the user that their avatar has been reset and which policy was violated, to help them learn and abide by our platform’s guidelines in the future. While no system is perfect, these notifications give users a way to flag a possible moderation error and help us improve accuracy over time.
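For readers curious about the mechanics, here’s a minimal sketch in Python of what a scan-on-change flow like this could look like. Everything in it is an illustrative assumption rather than Roblox’s implementation: the names (`Avatar`, `classify_full_avatar`, `notify_user`), the policy label, and especially the hand-written rule table, which stands in for what would in practice be a learned model evaluating the rendered avatar as a whole.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical rule table standing in for a learned classifier. Each item
# below is compliant on its own, but the full combination is not (this
# mirrors the example described above).
DISALLOWED_COMBINATIONS = {
    frozenset({"black suit", "armband", "toothbrush mustache"}): "hateful imagery",
}

@dataclass
class Avatar:
    user_id: str
    equipped_items: set = field(default_factory=set)

def classify_full_avatar(avatar: Avatar) -> Optional[str]:
    """Evaluate the complete avatar, with all equipped items considered
    together rather than item by item. Returns the name of the violated
    policy, or None if the avatar is compliant."""
    for combination, policy in DISALLOWED_COMBINATIONS.items():
        if combination <= avatar.equipped_items:  # combination is a subset
            return policy
    return None

def notify_user(user_id: str, message: str, allow_appeal: bool) -> None:
    """Stand-in for the real notification system."""
    if allow_appeal:
        message += " If you believe this is an error, you can flag it for review."
    print(f"[to {user_id}] {message}")

def on_avatar_changed(avatar: Avatar) -> None:
    """Runs on every avatar change, often before other users see it."""
    violated_policy = classify_full_avatar(avatar)
    if violated_policy is None:
        return  # compliant: publish the avatar as-is
    avatar.equipped_items.clear()  # reset to default: all items unequipped
    notify_user(
        avatar.user_id,
        f"Your avatar was reset because it violated our {violated_policy} policy.",
        allow_appeal=True,
    )
```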
Supporting Civility Research
To further advance research for the gaming industry, Roblox collaborated with the Prosocial Design Network, an organization dedicated to promoting design elements that foster positive behavior on online platforms. The goal of this collaboration was to identify research projects that may help promote positive social connection on gaming platforms. After reviewing the proposals, we’re happy to share that six research teams have been awarded grants to pursue projects related to fostering connection through gaming. As these projects proceed, we look forward to seeing what’s learned and whether any of the findings might inform future product design at Roblox and other companies. More details on the winning projects can be found here.
Report: Designing for Positive Social Connections
In collaboration with the Young and Resilient Research Centre at Western Sydney University and PROJECT ROCKIT, we recently released a report on Building Connection. The report provides a framework for policymakers and leaders in the tech industry to intentionally and proactively design for children’s well-being, autonomy, and healthy social connection. Its evidence-based recommendations include technical innovations and regulatory standards to help the industry come together to create safe, connected digital environments where young people can thrive.
The recommendations in this report reinforce the investments Roblox has made over the years, and many align with how we think about designing for safety, including:
- Multilayered safety system: Over the years, we’ve designed a layered safety system with built-in redundancies and continuous improvement to help balance innovation with user safety.
- Partnering with experts: We pair sophisticated safety technology with real-world collaboration, working with parents, safety experts, and government agencies to keep our community safe and civil.
- Teen and parent councils: These councils bring young people and their parents in to consult on some of our policies and features, providing insight into language to ensure that the content we share with younger users feels relevant and respectful.
- Real-time voice moderation: This system triggers notifications to let users know when they’ve violated a policy, educating them on our policies and applying chat time-outs for subsequent violations (a minimal sketch of this escalation pattern follows this list).
- Proactive avatar moderation: As noted above, this new system also includes information to help educate users about why their avatar cannot be published and which policy their avatar or update violates.
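For the real-time voice moderation bullet above, here’s a minimal, hypothetical sketch of the escalation pattern it describes: a first violation triggers an educational notification only, and repeat violations add chat time-outs. The durations, the in-memory counter, and the function names are illustrative assumptions, not Roblox’s actual values or architecture.

```python
from collections import defaultdict

# Illustrative escalation ladder: the first violation educates only;
# repeat violations add increasingly long chat time-outs. The durations
# (in seconds) are made up for this sketch.
TIMEOUT_LADDER_SECONDS = [0, 60, 300, 900]

violation_counts = defaultdict(int)  # user_id -> violations so far

def on_voice_violation(user_id: str, policy: str) -> None:
    """Notify the user in real time and apply an escalating time-out."""
    violation_counts[user_id] += 1
    step = min(violation_counts[user_id], len(TIMEOUT_LADDER_SECONDS)) - 1
    timeout = TIMEOUT_LADDER_SECONDS[step]
    message = f"Your voice chat violated our {policy} policy."
    if timeout > 0:
        message += f" Voice chat is paused for {timeout // 60} minute(s)."
    print(f"[to {user_id}] {message}")
```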
“Our collaboration with Roblox underscores the importance of evidence-informed design to create digital environments that prioritize young people’s wellbeing,” said Professor Amanda Third of the Young and Resilient Research Centre at Western Sydney University. “This important piece of work highlights the value of bringing researchers, young people, and industry together to deepen our understanding and to design spaces that nurture connection, empathy, inclusion, and positive social engagement. Cross-sector collaboration is essential if we want to build online environments that uphold children’s rights and meaningfully support prosocial behaviours, with young people’s voices at the center of that process.”
All of this is part of our ongoing outreach and collaboration with youth organizations, including PROJECT ROCKIT, to support their work in preventing online bullying and promoting healthy, positive social connections online.