On Wednesday, Activision announced that it will introduce real-time AI-powered voice chat moderation in the upcoming November 10 release of Call of Duty: Modern Warfare III. The company is partnering with Modulate to implement the feature, using a technology called ToxMod to identify and take action against hate speech, bullying, harassment, and discrimination.
While the industry-wide challenge of toxic online behavior isn't unique to Call of Duty, Activision says the scale of the problem has been amplified by the franchise's massive player base. So it's turning to machine-learning technology to help automate the solution.
ToxMod is an AI-powered voice moderation system designed to identify and act against what Activision calls “harmful language” that violates the game’s code of conduct. The aim is to supplement Call of Duty’s existing anti-toxicity measures, which include text filtering in 14 languages and an in-game player-reporting system.