The startup Modulate has raised $30 million to develop its Artificial Intelligence (AI) voice moderation product ToxMod.

ToxMod leverages machine learning to scan voice chats and identify the toxic players in online games.

Through the use of AI, ToxMod highlights problems that human moderators should flag when players are chatting with one another in online games. Toxicity during in-game interactions is a major issue and will only grow as we move toward metaverse worlds like those depicted in Ready Player One and Snow Crash.


Modulate was able to raise such a large sum because its service is already being used by community managers in major titles such as Rec Room and PokerStars VR to identify some of the leading toxicity problems in their experiences. Modulate CEO Mike Pappas says toxicity in online games is a “large-scale market need.”

The funding round was led by Lakestar. Also participating in the funding round were Modulate’s existing investors such as Hyperplane Ventures and Everblue Management among others. A Lakestar managing partner, Mika Salmi, will also be joining Modulate’s board.

The company’s ToxMod product provides proactive voice moderation built to capture different kinds of toxicity, ranging from overt offenses such as adult language to more insidious forms such as violent radicalization, child grooming, and self-harm. ToxMod’s AI has been trained on 10 million hours of audio.

Modulate says it wants to change how online toxicity is handled by game developers and the huge funding will go a long way in helping the company realize its mission, according to CEO Pappas.

Pappas says the company’s core business is proactive voice moderation that enables developers to comprehensively handle all forms of bad behavior across platforms instead of simply relying on player reports.

ToxMod leverages sophisticated machine learning models that go beyond transcription: they understand not only what a player is saying but also how they are saying it, including emotion, volume, prosody, and more. This matters because context shapes conversations: something that reads as friendly trash talk in one setting could be harmful in another.
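To make that idea concrete, here is a minimal, purely illustrative Python sketch of how content and prosody signals might be combined into a single toxicity score. The feature names, keyword list, and thresholds are invented for the example and do not describe Modulate's actual models.

```python
from dataclasses import dataclass

@dataclass
class VoiceClip:
    """One segment of voice chat plus simple prosodic features (hypothetical)."""
    transcript: str        # what was said
    loudness_db: float     # how loud it was said
    arousal: float         # 0..1 estimate of emotional intensity
    friendly_banter: bool  # e.g. trash talk between teammates who know each other

# Invented keyword list standing in for a learned content model.
FLAGGED_TERMS = {"idiot", "get out of my game", "kill yourself"}

def toxicity_score(clip: VoiceClip) -> float:
    """Combine *what* was said with *how* it was said.

    Content hits set a base score; shouting and high emotional intensity push it
    up; a friendly-banter context discounts it, since the same words can be
    harmless trash talk in one setting and harmful in another.
    """
    hits = sum(term in clip.transcript.lower() for term in FLAGGED_TERMS)
    if hits == 0:
        return 0.0
    score = 0.4 * hits
    if clip.loudness_db > 75:        # shouting raises the score
        score += 0.3
    score += 0.3 * clip.arousal      # anger or distress raises it further
    if clip.friendly_banter:
        score *= 0.5                 # discount friendly trash talk
    return min(score, 1.0)

if __name__ == "__main__":
    clip = VoiceClip("get out of my game", loudness_db=82.0, arousal=0.9,
                     friendly_banter=False)
    print(f"toxicity score: {toxicity_score(clip):.2f}")
```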

The startup says ToxMod leverages a nuanced understanding of voice to tell these situations apart and identify the worst actors, while still letting everyone enjoy each game in their own way.

This sophistication enables ToxMod to identify offenses with an accuracy of more than 98%, a figure that is expected to improve over time, and lets moderation teams respond to incidents more than 25 times faster.

The product has already been successfully commercialized and is being used by some large customers.

Many companies providing online social networking services have a serious toxicity problem, with the volumes of reports often overwhelming human moderators.

Human moderators are often able to intercept only a tiny fraction of the toxicity circulating in online worlds.

Modulate says ToxMod aims to shine a light on toxicity by first mapping out and understanding its landscape: how it comes about, where it is occurring, and how players are reacting to it. The startup then explores ways of working with customers to design educational programs.
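As a rough illustration of what "mapping the landscape" could look like in practice, the hypothetical sketch below aggregates flagged incidents by category, by where they occurred, and by how players reacted. The field names and categories are assumptions made for the example, not Modulate's schema.

```python
from collections import Counter
from typing import NamedTuple

class Incident(NamedTuple):
    category: str   # e.g. "harassment", "hate speech", "self-harm"
    surface: str    # where it occurred, e.g. "team voice", "proximity chat"
    reaction: str   # how other players responded, e.g. "joined in", "pushed back", "left"

def summarize_landscape(incidents: list[Incident]) -> dict[str, Counter]:
    """Aggregate incidents by type, location, and player reaction."""
    return {
        "by_category": Counter(i.category for i in incidents),
        "by_surface": Counter(i.surface for i in incidents),
        "by_reaction": Counter(i.reaction for i in incidents),
    }

if __name__ == "__main__":
    sample = [
        Incident("harassment", "team voice", "pushed back"),
        Incident("harassment", "proximity chat", "left"),
        Incident("hate speech", "team voice", "joined in"),
    ]
    for key, counts in summarize_landscape(sample).items():
        print(key, dict(counts))
```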

Players have to know why they are being punished. For instance, if toxicity involves allegations of cheating, it is important that the players involved are notified. Modulate is also considering how to help with the mental health of moderators who grapple with different kinds of abuse.

In the metaverse, it would be more sensible for game companies to work on these problems within the smaller context of their own games before those games are connected to other applications.

Where traditional voice moderation tools rely on the roughly 8% of players who submit toxicity reports, ToxMod provides proactive moderation that empowers platform and game moderators to make informed decisions and protect players from abuses ranging from toxic behavior and harassment to more insidious harms such as child grooming. Modulate says it is already helping its customers handle thousands of instances of online toxicity.

Modulate says it is ensuring that ToxMod doesn’t misclassify mild or harmless interactions such as trash talk, which can be acceptable in certain contexts and in games such as the mature-rated Call of Duty.

ToxMod aims to make moderators more effective across platforms and contexts. According to Pappas, the pattern-spotting model has been trained over time and keeps improving: as human moderators review the results and pick out false positives, the system continues to learn and refine its algorithm. The system can then begin to take immediate action, although Modulate concedes that ToxMod can still misunderstand conversations given the complexity of human language, and every game has different standards.
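The sketch below shows one hypothetical way such a review loop could be wired up: moderator verdicts on flagged clips feed back into the system, and only the highest-confidence flags are acted on automatically. The class names, threshold, and retuning rule are all invented for illustration and are not how ToxMod is actually implemented.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Flag:
    clip_id: str
    score: float                      # model confidence that the clip is toxic
    confirmed: Optional[bool] = None  # moderator verdict: True, False, or not yet reviewed

@dataclass
class ModerationLoop:
    """Moderator verdicts tune when the system may act on its own."""
    auto_action_threshold: float = 0.95
    reviewed: list = field(default_factory=list)

    def record_review(self, flag: Flag, confirmed: bool) -> None:
        """A human moderator confirms a flag or marks it as a false positive."""
        flag.confirmed = confirmed
        self.reviewed.append(flag)
        self._retune()

    def _retune(self) -> None:
        # If false positives keep landing above the auto-action threshold,
        # raise the threshold; a production system would also feed these
        # examples back into model retraining.
        false_positives = [f for f in self.reviewed
                           if f.confirmed is False and f.score >= self.auto_action_threshold]
        if len(false_positives) > 3:
            self.auto_action_threshold = min(0.99, self.auto_action_threshold + 0.01)

    def should_auto_action(self, flag: Flag) -> bool:
        """Only the highest-confidence flags skip the human review queue."""
        return flag.score >= self.auto_action_threshold

if __name__ == "__main__":
    loop = ModerationLoop()
    loop.record_review(Flag("clip-001", 0.97), confirmed=False)  # a false positive
    print(loop.should_auto_action(Flag("clip-002", 0.96)))
```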

Likewise, words such as “pwned” must be read in the context they are used in. Modulate says it has built a comprehensive data set designed specifically for “real social voice chats online.” According to Pappas, this has enabled the company to build models whose accuracy surpasses the major off-the-shelf transcription models by a good margin.

Modulate says it cultivated excellent relationships with VCs because it focused on the fundamentals of running a strong business. As a result, it was able to raise money at a time when doing so was not easy even for more established players such as game developers.

Modulate currently employs 27 people.

https://virtualrealitytimes.com/wp-content/uploads/2022/08/Modulate-600x461.pnghttps://virtualrealitytimes.com/wp-content/uploads/2022/08/Modulate-150x90.pngRob GrantTechnologyThe startup Modulate has raised $30 million to develop its Artificial Intelligence (AI) voice moderation product ToxMod. ToxMod leverages machine learning to scan voice chats and identify the toxic players in online games. Through the use of AI, ToxMod highlights problems that human moderators should flag when players are chatting with...VR, Oculus Rift, and Metaverse News - Cryptocurrency, Adult, Sex, Porn, XXX