Call of Duty: Modern Warfare III to Combat Toxicity with AI
Activision Introduces ‘ToxMod’ to Filter Toxic Voice Chats in Upcoming Release
In a significant move towards fostering a healthier gaming environment, Activision, the publisher of Call of Duty, has unveiled plans to use artificial intelligence for voice chat moderation in its highly anticipated title, Call of Duty: Modern Warfare III, scheduled for release on November 10, 2023.
‘ToxMod’: The AI-Powered Solution
Detecting Toxicity in Voice Chats
Activision has partnered with Modulate to build a voice chat moderation tool called ‘ToxMod.’ The tool uses machine learning to detect and flag several forms of toxicity common in online gaming (an illustrative sketch of this kind of pipeline follows the list), including:
Hate Speech: Swift detection of slurs and hate-driven conversation.
Harassment: Flagging targeted abuse aimed at other players.
Bullying: Identifying bullying behavior that makes matches unsafe.
Sexism: Recognizing sexist remarks.
Discriminatory Language: Catching other forms of discriminatory speech.
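Neither Activision nor Modulate has published ToxMod’s internals, but the general shape of such a system is well understood: transcribe a voice clip to text, score the transcript against each toxicity category with a trained classifier, and escalate confident hits for human review. The Python sketch below is purely illustrative; every function name, lexicon entry, and threshold is a made-up stand-in, not ToxMod’s actual code.

```python
# Illustrative sketch only: Modulate has not published ToxMod's internals,
# so every name, lexicon, and threshold below is a hypothetical stand-in.
# The overall shape is typical of voice moderation pipelines: transcribe
# the audio, score the transcript per toxicity category, and escalate
# high-confidence hits to human moderators rather than auto-punishing.

from dataclasses import dataclass

CATEGORIES = ("hate_speech", "harassment", "bullying", "sexism", "discrimination")

@dataclass
class Flag:
    category: str
    confidence: float
    transcript: str

def transcribe(audio_clip: bytes) -> str:
    """Stand-in for a speech-to-text (ASR) model."""
    return audio_clip.decode("utf-8", errors="ignore")  # toy: clips are text here

def score(transcript: str) -> dict[str, float]:
    """Stand-in for a trained classifier; a real system would weigh tone,
    context, and intent, not just keywords."""
    toy_lexicon = {"harassment": {"worthless", "quit"}}
    words = {w.strip(".,!?") for w in transcript.lower().split()}
    return {
        cat: 0.95 if words & toy_lexicon.get(cat, set()) else 0.05
        for cat in CATEGORIES
    }

def moderate_clip(audio_clip: bytes, threshold: float = 0.9) -> list[Flag]:
    """Return flags for any category scoring above the confidence threshold."""
    transcript = transcribe(audio_clip)
    return [
        Flag(cat, conf, transcript)
        for cat, conf in score(transcript).items()
        if conf >= threshold
    ]

if __name__ == "__main__":
    for flag in moderate_clip(b"you are worthless, quit the game"):
        print(f"escalate for review: {flag.category} ({flag.confidence:.2f})")
```

Note the deliberate shape here: the pipeline surfaces flagged clips rather than issuing punishments itself, leaving enforcement decisions to human moderators.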
An Arsenal Against Toxicity
Reinforcing Anti-Toxicity Measures
‘ToxMod’ will complement Call of Duty’s existing anti-toxicity arsenal, which already includes text-based filtering across 14 languages for in-game text chat (sketched in simplified form below) as well as a robust in-game reporting system.
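Activision hasn’t detailed how that text filter is implemented; the snippet below is a minimal toy version, assuming a simple per-language blocklist that masks matches before a message is displayed. A production filter spanning 14 languages would be far larger and would also handle obfuscation (leetspeak, spacing tricks) and context.

```python
# Hypothetical sketch, not Activision's implementation: the simplest form a
# multi-language text filter can take is a per-language blocklist whose
# matches are masked before the message reaches other players.

import re

# Toy blocklists keyed by language code; a real filter would cover far
# larger vocabularies in each of its supported languages.
BLOCKLISTS: dict[str, set[str]] = {
    "en": {"badword", "idiot"},
    "de": {"schimpfwort"},
    # ... one entry per supported language
}

def filter_message(message: str, language: str) -> str:
    """Replace blocked words with asterisks for the given language."""
    blocked = BLOCKLISTS.get(language, set())
    if not blocked:
        return message
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, sorted(blocked))) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(lambda m: "*" * len(m.group()), message)

print(filter_message("nice shot, idiot", "en"))  # -> nice shot, *****
```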
Stay tuned for a more inclusive and respectful gaming experience, as Activision takes a bold step towards curbing toxicity in Call of Duty: Modern Warfare III.
FAQs About Using AI to Filter Toxic Voice Chat
How does ‘ToxMod’ work in identifying toxic voice chats?
‘ToxMod’ uses machine learning to recognize and flag toxic speech in voice chat, including hate speech, harassment, bullying, sexism, and discriminatory language.
What are Call of Duty’s existing anti-toxicity measures?
Apart from ‘ToxMod,’ Call of Duty employs text-based filtering across 14 languages for in-game text chat, along with a robust in-game reporting system.