Activision Rolls Out Beta Version Of Its Call Of Duty Voice Chat Moderation Tool

[Image: Call of Duty: Warzone 2.0 Ghost character art]

Activision is stepping up its anti-toxicity measures in Call of Duty games with a new voice chat moderation system. The publisher touts the new tool as “the next leap forward” in its ongoing effort to combat toxicity and disruptive behavior in the multiplayer shooter series.

It’s basically a version of Modulate’s ToxMod that has been tweaked to work with Call of Duty games. For those unfamiliar, ToxMod is an AI-powered moderation tool that screens voice chat for hate speech, harassment, and other discriminatory language; it was refined with the help of the Anti-Defamation League.

“This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system,” Activision said.

The voice chat moderation tool will be deployed alongside Call of Duty: Modern Warfare 3 on November 10th. A beta version of the tool went live in Call of Duty: Modern Warfare 2 and Warzone 2.0 earlier this week.

Note that there’s no way to opt out of the moderation tool short of disabling voice chat entirely. Don’t worry, though: as revealed in this FAQ, you’ll still be able to engage in the age-old tradition of trash talk and friendly banter.