In brief: Much speculation surrounds the benefits that technologies like generative AI and machine learning might bring to video games. While image reconstruction has proven useful and generative AI has reportedly entered production pipelines, AI-based analysis is already reshaping voice chat moderation in competitive gaming spaces.
Activision Blizzard reports that exposure to toxic voice chat in Call of Duty has declined by 43 percent since the beginning of this year. The publisher credits the recent implementation of AI-based moderation for the results, which have convinced it to expand its use when Call of Duty: Black Ops 6 launches on October 25.
The publisher introduced the AI moderation tool, ToxMod, when it launched Call of Duty: Modern Warfare III last November. According to ToxMod's website, the system analyzes transcripts of in-game voice chat, detecting keywords and weighing them against several contextual factors.
To tell the difference between trash talk and harassment, ToxMod monitors keywords and reactions to comments, recognizing player emotions and understanding the game's code of conduct. Furthermore, the system estimates the perceived age and gender of the speaker and recipients to understand each conversation's context better.
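Modulate, the company behind ToxMod, does not publish the system's internals, but the behavior it describes roughly maps to a pipeline that scores each utterance on keyword hits plus contextual signals before deciding whether it deserves attention. The sketch below is purely illustrative; every name, weight, and threshold is an assumption rather than anything Modulate has disclosed.

```python
# Hypothetical sketch: ToxMod's internals are not public. This only illustrates
# how keyword hits could be weighed against contextual signals (speaker emotion,
# listener reaction) to separate trash talk from harassment. All names, weights,
# and thresholds are invented for illustration.
from dataclasses import dataclass

FLAGGED_TERMS = {"example_slur", "example_threat"}  # placeholder keyword list


@dataclass
class VoiceClip:
    transcript: str           # speech-to-text output for one utterance
    speaker_emotion: float    # 0.0 = calm ... 1.0 = aggressive (model estimate)
    listener_distress: float  # 0.0 = unbothered ... 1.0 = distressed (model estimate)


def toxicity_score(clip: VoiceClip) -> float:
    """Blend a keyword hit with contextual signals into one score in [0, 1]."""
    keyword_hit = any(term in clip.transcript.lower() for term in FLAGGED_TERMS)
    score = 0.6 if keyword_hit else 0.0
    # Friendly banter between calm players scores low even with edgy language;
    # an aggressive speaker or a distressed listener pushes the score up.
    score += 0.25 * clip.speaker_emotion + 0.15 * clip.listener_distress
    return min(score, 1.0)


def should_flag_for_review(clip: VoiceClip, threshold: float = 0.7) -> bool:
    """Clips above the threshold go to a human moderator; nothing is auto-banned."""
    return toxicity_score(clip) >= threshold
```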
ToxMod can't ban anyone on its own, but it can quickly flag violations for review by human moderators. Activision then decides whether to warn or mute players, only issuing bans after repeated infractions. The number of repeat offenders in Modern Warfare III and Call of Duty: Warzone has fallen by 67 percent since Activision implemented improvements in June 2024. In July alone, 80 percent of players caught violating voice chat rules didn't re-offend.
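Activision hasn't published its exact enforcement ladder beyond confirming that the AI only flags clips and humans make the call, so the following is a hypothetical illustration of a warn-mute-ban escalation gated on human review; the penalty names and counts are invented.

```python
# Hypothetical sketch of the warn -> mute -> ban ladder described above.
# Activision's real enforcement rules are not public; the penalty names and
# escalation counts here are invented purely to illustrate the flow.
from collections import defaultdict

PENALTY_LADDER = ["warning", "voice_chat_mute", "temporary_ban"]  # assumed order

confirmed_violations: dict[str, int] = defaultdict(int)


def apply_penalty(player_id: str, confirmed_by_human: bool) -> str | None:
    """Escalate only after a human moderator confirms an AI-flagged clip."""
    if not confirmed_by_human:
        return None  # an AI flag alone never triggers a penalty
    confirmed_violations[player_id] += 1
    step = min(confirmed_violations[player_id], len(PENALTY_LADDER)) - 1
    return PENALTY_LADDER[step]


# A first confirmed violation draws a warning; repeat offenses escalate.
print(apply_penalty("player_123", confirmed_by_human=True))  # -> warning
print(apply_penalty("player_123", confirmed_by_human=True))  # -> voice_chat_mute
```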
ToxMod is currently active in all regions except Asia, and Call of Duty uses the system to moderate voice chat in English, Spanish, and Portuguese. When Black Ops 6 launches, Activision will add support for French and German.
Meanwhile, text-based chat and username moderation expanded from 14 to 20 languages in August. Community Sift has been Call of Duty's text moderation partner since the Modern Warfare reboot launched in 2019 and has blocked 45 million messages since Modern Warfare III's November release.
Using AI to moderate player behavior is far less controversial than employing the technology for in-game assets. Late last year, AI-generated art appeared in cosmetic content for Modern Warfare III. Coming amid a historic wave of gaming industry layoffs, the discovery fueled fears that publishers will try to replace artists with AI models.