In a new interview, Blizzard's Jeff Kaplan has revealed that the company is looking for ways to use AI to track and punish toxic Overwatch players, an issue that Kaplan and his team feel strongly about. The issue matters enough, in fact, that content creation has been scaled back somewhat to give developers more time to address it properly. Using AI may be just the ticket.
"We've been experimenting with machine learning. We've been trying to teach our games what toxic language is, which is kinda fun. The thinking there is you don't have to wait for a report to determine that something's toxic. Our goal is to get it so you don't have to wait for a report to happen."
The overall goal is, of course, to catch toxic behavior before it is even reported, and eventually to reduce or even eliminate reliance on the player reporting system, which has its drawbacks: reports come in slowly, punishments lag behind, and the system itself can be abused by some players.
Blizzard will start by 'teaching' its machines about the worst cases of toxic behavior. "With anything involving reporting and player punishments, you want to start with the most extreme cases and then find your way to ease up after that," Kaplan said.
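To give a rough sense of what "teaching a game what toxic language is" can look like, here is a minimal sketch of a Naive Bayes text classifier trained on a handful of labeled chat messages. This is purely illustrative: Blizzard has not published its model, and the training messages, labels, and word-level approach here are all invented for the example.

```python
# Toy sketch of the idea Kaplan describes: learn from labeled examples so
# toxic chat can be flagged without waiting for a player report.
# NOT Blizzard's actual system -- all data and design choices are assumptions.
import math
from collections import Counter

def train(messages):
    """messages: list of (text, label) pairs. Returns per-label word counts."""
    counts = {"toxic": Counter(), "ok": Counter()}
    totals = Counter()
    for text, label in messages:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Pick the label with the higher smoothed log-likelihood."""
    vocab = set(counts["toxic"]) | set(counts["ok"])
    scores = {}
    for label in counts:
        score = 0.0
        for word in text.lower().split():
            # Laplace smoothing so unseen words don't zero out the score.
            p = (counts[label][word] + 1) / (totals[label] + len(vocab))
            score += math.log(p)
        scores[label] = score
    return max(scores, key=scores.get)

# Invented training data, starting (as Kaplan suggests) with extreme cases.
training = [
    ("you are garbage uninstall now", "toxic"),
    ("worst player ever quit the game", "toxic"),
    ("nice shot well played", "ok"),
    ("good game everyone thanks", "ok"),
]
counts, totals = train(training)
print(classify("you are the worst player", counts, totals))  # -> toxic
```

A real system would need far more data, context awareness, and human review before any punishment, but the core loop is the same: label extreme cases first, train, then flag new messages automatically.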
What do you think of this solution to toxicity and its implications for other games? Is AI another tool to pursue to stem the flow? Leave us your thoughts in the comments.
You can read the full interview at Kotaku.