Riot Games to record Valorant voice chats to combat toxic behavior

In an effort to tackle abusive behavior from toxic players, Riot Games is updating its privacy notice next month to allow new tools aimed at reducing toxicity. The first game to use these capabilities is Valorant, where in-game voice chat can be recorded when a report is filed. In my opinion, this is very likely to raise privacy and security questions, and players who disagree with the decision and care about their privacy can always switch to third-party voice chat platforms such as Discord, Overtone, or similar applications.

The audio data will be stored and used when players report others for abusive or disruptive behavior. The recording is then checked against Riot Games' Terms of Service and other policies. If a violation is found, the audio will be made available to the player in violation, and it will be deleted once it's no longer needed.

Data will also be deleted if the recording contains no disruptive behavior. A similar system is in place on PlayStation, though there it requires players to record a short section of a conversation themselves and submit it for moderation review.

Riot said that the new voice moderation tools don't involve actively listening to live in-game audio. The studio said it will only review audio once a report has been filed. For now, the system will be beta tested in North America before expanding to the rest of the world.

"We're committed to making our games better for everyone who plays them," Riot said. "This is another step toward tackling disruptive behavior across the board, starting with Valorant." These changes will be applied across Riot Games, TechCrunch reports, meaning players across all the studio's games, including League of Legends and Legends of Runeterra, must accept the updated privacy policy next month. There is no way around it: a player either accepts it or quits the game. It's unclear when other Riot Games titles will actually get these updates.

Riot didn't specify how these new voice moderation tools will work. According to head of player dynamics Weszt Hart, the technology required to detect behavior violations over voice chat is still in development. Hart hinted that it may rely on machine learning or focus on automated voice-to-text transcription. Hart also made explicit reference to the "pain in voice comms" that spurred Riot to seek a solution for tackling abusive or disruptive behavior while gaming online.

"Players are experiencing a lot of pain in voice comms, and that pain takes the shape of all kinds of disruptive behavior, and it can often be pretty harmful," Hart said. "We recognize that, and we have made a promise to players that we'll do everything that we can in this space."

For now, this is only in beta testing, so it may or may not see a full release. In my personal opinion, it's not the right step: it would be better if players simply muted and reported whoever is being abusive, because I see this as a privacy problem. And let's not forget that Riot Games is owned by Tencent.
