Zenith: The Last City is adopting a new AI-powered moderation tool to help make the VR MMORPG a safer and more enjoyable place to play.
The tool, ToxMod, comes from a company called Modulate and is, according to the announcement from the Zenith team, “primarily targeting voice chat behavior that has no place in Zenith, such as racism, homophobia, transphobia, sexually explicit language and harassment. The tool is also intelligent enough to tell apart friends bantering (even with casual swearing) from toxic and aggressive voice behavior”.
Once this voice chat behavior is captured, it's sent to human moderators to review and act on as needed. No automatic action is taken, and the system is designed to supplement player reporting. Voice chat is only one aspect of a VR MMORPG, but the Ramen VR team admits that moderation tools in the space have been limited so far. Since the studio is looking to keep growing, even expanding Zenith into a broader universe, this is a place to start.
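Purely as an illustration of that human-in-the-loop design, here is a minimal sketch of how a "flag for review, never auto-punish" pipeline might be modeled. None of these names reflect Modulate's actual ToxMod API or Ramen VR's systems; they are hypothetical, and the sketch only captures the workflow described in the announcement, with AI flags and player reports feeding the same human review queue.

```python
# Hypothetical sketch only: names and structure are invented, not ToxMod's API.
from dataclasses import dataclass, field
from collections import deque
from typing import Deque, Optional


@dataclass
class Flag:
    """A voice-chat clip flagged for review, by the AI or by a player report."""
    player_id: str
    clip_id: str
    source: str   # "ai_flag" or "player_report"
    reason: str


@dataclass
class ReviewQueue:
    """Every flag lands here; only a human moderator decides on any action."""
    pending: Deque[Flag] = field(default_factory=deque)

    def flag_from_ai(self, player_id: str, clip_id: str, reason: str) -> None:
        # The AI can only enqueue clips for review; it never punishes directly.
        self.pending.append(Flag(player_id, clip_id, "ai_flag", reason))

    def flag_from_report(self, player_id: str, clip_id: str, reason: str) -> None:
        # Player reports supplement the AI flags rather than being replaced by them.
        self.pending.append(Flag(player_id, clip_id, "player_report", reason))

    def next_for_moderator(self) -> Optional[Flag]:
        # A human moderator pulls the next item and decides what action, if any, to take.
        return self.pending.popleft() if self.pending else None


if __name__ == "__main__":
    queue = ReviewQueue()
    queue.flag_from_ai("player_123", "clip_456", "harassment")
    queue.flag_from_report("player_789", "clip_012", "slurs in party voice chat")
    while (item := queue.next_for_moderator()) is not None:
        print(f"Needs human review: {item}")
```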
In announcing the tool, the team cites that growth and frames ToxMod as a scalable moderation option that can work alongside their community. It's one reason they point to the tool's ability to tell the difference between friendly banter and genuinely toxic trash talk. AI tools are constantly in the news lately, along with the concerns they raise, and in the announcement the Zenith team specifies that this moderation tool has privacy protections in place. “Players can be comfortable knowing that their voice data is being used only to improve our moderation process and isn’t being sold to 3rd parties,” according to the announcement.
With an MMORPG in VR, it does make sense that moderation tools would need to adapt to the environment. Letting the space become too unpleasant to play in can drive players away, so for a studio looking for growth, this is a sensible approach.