How an AI moderator aims to eliminate toxicity and cheating in online multiplayer gaming

Equipped with Minerva AI technology, FACEIT ensures that gamers are protected from cheats. (Screen grab from FACEIT website)
  • Minerva is a specialized AI technology that moderates text, audio and behavioral data to detect toxicity on FACEIT’s gaming platform
  • Players caught harassing, cheating, or undermining account integrity can be temporarily banned or face a multi-year suspension

RIYADH: As the world’s leading online platform for competitive gaming, FACEIT is leveraging the power of artificial intelligence to ensure a safe environment for its global — and growing — 25 million user community.

The FACEIT platform uses Minerva, a specialized AI technology that can understand in-game actions and other non-text chat behaviors, identifying trends that suggest poor sportsmanship beyond explicit statements.

Minerva has documented more than 4 billion messages on the esports platform and has implemented more than 5 million corrective actions to improve player interactions and police bad behaviors.

FACEIT is the digital platform offering of ESL FACEIT Group, a gaming and esports company acquired for $1.5 billion in 2022 by Saudi Arabia’s Savvy Games Group, which is wholly owned by the country’s Public Investment Fund.

The industry as a whole is already immensely profitable. In 2023, the global online gaming market generated approximately $26.14 billion in revenue, which translates to 9.8 percent growth compared to the previous year, according to Statista.

Saudi Arabia is considered a key market. A report by the US-Saudi Business Council found that more than 68 percent of young Saudi citizens, and 58 percent of the population as a whole, self-identify as gamers.


According to Maria Laura “Lulu” Scuri, vice president of labs and community integrity at ESL FACEIT Group, more than 80 percent of gamers have reported experiencing harassment in a multiplayer game, while 28 percent stopped playing their favorite games because of toxic behaviors.

“Toxicity and harassment take many shapes and forms, ranging from in-game actions (griefing and sabotaging teammates) to verbal and text abuse (mic spam, insults and cursing) to targeted attacks based on a player’s identity (sexism, racism and more),” Scuri told Arab News.

“Negative in-game interactions make it more difficult for individuals to enjoy their play time, forge meaningful relationships with others, and connect with a community that, overall, tends to provide a positive experience.”




Maria Laura “Lulu” Scuri.

Scuri says AI tools like Minerva help human moderators make quicker and better decisions to fight toxicity at a scale that would not otherwise be possible.

“These systems don’t only protect users but also encourage positive play, acknowledging players’ impact on improving their community and the FACEIT platform.”

Scuri says the system is “almost human” in its judgment and performance “thanks to the wealth of data Minerva has analyzed.”




To ensure a safe environment, Minerva provides anti-cheat and chat moderation. (Screen grab from FACEIT platform)

“The AI has a human-like understanding of interpersonal interactions. For example, not every curse word or piece of slang is malicious,” she said.

“Instead, Minerva looks for patterns in behavior and the full context of text and voice messages to determine if behavior is worthy of a flag. As a whole, this work allows FACEIT to not only efficiently identify bad behavior, but do so at a scale that meaningfully shapes how players experience their favorite games.”
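FACEIT has not published Minerva’s internals, but the context-aware approach Scuri describes — flagging sustained patterns of behavior rather than single curse words — can be illustrated with a toy sketch. All names, markers and thresholds below are hypothetical, chosen only to show the idea:

```python
# Illustrative sketch only: Minerva's real models are proprietary.
# A toy moderator that flags players on repeated patterns across recent
# messages, rather than on any single keyword in isolation.
from collections import defaultdict

TOXIC_MARKERS = {"trash", "uninstall", "worthless"}  # hypothetical examples

class ToyModerator:
    def __init__(self, window=10, threshold=3):
        self.window = window        # number of recent messages considered
        self.threshold = threshold  # cumulative hits needed before flagging
        self.history = defaultdict(list)

    def observe(self, player, message):
        """Record a message; return True only if the recent pattern warrants a flag."""
        hits = sum(word in TOXIC_MARKERS for word in message.lower().split())
        self.history[player].append(hits)
        recent = self.history[player][-self.window:]
        # One stray insult stays below the threshold; a sustained pattern does not.
        return sum(recent) >= self.threshold

mod = ToyModerator()
mod.observe("p1", "nice shot")     # no flag
mod.observe("p1", "you're trash")  # isolated insult, still below threshold
```

A production system would of course use learned classifiers over text, voice and in-game events rather than a word list, but the escalation-over-context principle is the same.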

Popular multiplayer first-person shooter game “Counter-Strike 2” was released on the FACEIT platform in September last year, allowing players to join communities. In addition, they can join or host matches on private servers, participate in community tournaments, or qualify for the FACEIT Pro League.

DID YOU KNOW?

• FACEIT is an esports platform founded in 2012 that administers leagues for games including ‘Counter-Strike 2,’ ‘League of Legends,’ ‘Rocket League,’ and ‘Rainbow Six Siege.’

• In 2022, FACEIT and esports company ESL were acquired by Savvy Games Group, a holding company owned by Saudi Arabia’s Public Investment Fund, for a combined $1.5 billion.

• FACEIT’s Minerva engine is a specialized AI technology that analyzes and moderates text, audio and the behavioral data of players to detect toxicity and other abuses.

Alongside chat moderation, Minerva also provides anti-cheat protection to keep the competitive environment safe.

“Each game title and community is different, and moderation needs to reflect that,” said Scuri. “Whether it be adjusting to the ways players communicate with each other — text, voice and more — or the in-game actions that correlate with bad behavior, context is key.

“Instead of just punishing bad behavior, FACEIT is taking steps to reward positive play, encouraging the players who make a strong, positive impact in-game to continue to set an example for their community.”




(Screen grab from FACEIT platform)

There are, however, several penalties that Minerva can impose when users step out of line.

Players who violate FACEIT’s code of conduct may receive a warning, be temporarily banned, or face a multi-year suspension from participating in games or accessing their accounts. Meanwhile, “cooldowns” are time-based restrictions placed on accounts for smaller infractions.

Ban lengths vary based on the severity of the offense and the number of times a user has repeated the behavior. These offenses fall into three main categories: toxicity, subversion of account integrity, and cheating.

IN NUMBERS

$26.14 billion Global online gaming revenue in 2023. Source: Statista

$32.56 billion Projected global online gaming revenue in 2027.

1.13 billion Number of online gamers worldwide.

Toxicity includes acts of harassment, encouraging self-harm, spamming, posting offensive content, griefing, ghosting, blocking, team flashing or intentional team damage, abuse of the platform’s reporting system, and abuse of its live admins.

Violations of account integrity can include account sharing, ban evasion, boosting or ladder abuse, multi-accounting, and smurfing or intentional de-ranking.

If a player is caught cheating, they can be banned for two years. Any user caught evading a cheating ban by creating a new account will have that account permanently banned and deleted, and the cheating ban on the original account will be extended by another two years.
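The escalation rules above can be summarized as a simple decision table. The sketch below is illustrative only: the two-year cheating ban and ban-evasion rules come from the article, while the warning/cooldown durations for other categories are hypothetical placeholders:

```python
# Hypothetical sketch of FACEIT's escalating sanctions as described in the
# article; only the cheating rules are sourced, other durations are invented.
TWO_YEARS = 730  # days

def sanction(category, prior_offenses=0, evading_ban=False):
    """Return (action, duration_in_days) for a code-of-conduct violation."""
    if category == "cheating":
        if evading_ban:
            # New account permanently banned and deleted;
            # original account's ban extended by another two years.
            return ("permanent_ban_and_extend_original", None)
        return ("ban", TWO_YEARS)
    if category in ("toxicity", "account_integrity"):
        if prior_offenses == 0:
            return ("warning", 0)
        # Cooldowns and bans lengthen as offenses repeat (placeholder scale).
        return ("cooldown" if prior_offenses == 1 else "ban", 7 * prior_offenses)
    raise ValueError(f"unknown category: {category}")
```

The key design point matching the article is that severity scales with both the offense category and the repeat count, with cheating treated as categorically more serious than toxicity or account-integrity violations.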

It is hoped that the automatic detection of such violations by Minerva will make competitive gaming much fairer, match players more effectively, and ensure the online environment is both safe and enjoyable.