According to Decrypt, video game giant Activision has shared the initial results of ToxMod, the AI moderation tool it deployed in Call of Duty in August of last year. The tool has led to more than 2 million accounts receiving in-game enforcement for disruptive voice chat. ToxMod can identify disruptive comments across 14 languages in Call of Duty: Modern Warfare II, Modern Warfare III, and Call of Duty: Warzone. Since the AI model's introduction, Activision has reported a month-over-month decline in repeat offenders, including an 8% reduction, while severe instances of disruptive voice chat have fallen 50% since Modern Warfare III's launch.
The company emphasized the importance of active reporting by players and said it is exploring ways for players to provide additional feedback. Activision also updated the Call of Duty Code of Conduct, reiterating its zero-tolerance policy for bullying, harassment, and derogatory remarks related to race, gender identity or expression, sexual orientation, age, culture, faith, mental or physical abilities, or country of origin. The company says it remains committed to combating toxicity in its games and will continue to evolve its moderation technology to ensure a fair and fun experience for all players.