How toxicity works in gaming and how to combat it

Video games give players the chance to share their passion with other enthusiasts, in a friendly environment that can be both competitive and collaborative. Without effective moderation, however, a handful of disruptive players is unfortunately enough to ruin the in-game experience of an entire community.

Arnaud

Online abusive behavior with very real consequences

Online toxicity takes many forms: abusive language, insults, racism, sexism, and mockery of other players' skills. It can also escalate into psychological and sexual harassment, and the consequences of this behavior are felt immediately by the players behind their screens. Targeted gamers are likely to be hurt by such remarks, and may decide to leave the game altogether to protect themselves from the psychological damage the attacks cause.

Left unchecked, the problem of toxicity quickly snowballs. Other players, even those who are not targeted by the comments, choose to leave rather than put up with such behavior. The game earns a bad reputation; the publisher is singled out, its brand image suffers, and the ecosystem around its game is polluted. It can ultimately miss out on partnership opportunities with brands.

From tough competition to toxicity: the broken windows theory

Video game publishers are taking the issue of toxic online behavior increasingly seriously. Yet assessing how toxic a message really is remains complex and highly context-dependent: unlike in many other activities, it is common to see players teasing one another deliberately to provoke a reaction.

Mojang, the publisher of Minecraft, tried to address this problem by extending moderation to private servers: toxic messages could be reported, and offending gamers risked being banned as a result. The community reacted very badly, seeing the measure as heavy-handed and an attack on their freedom. Ironically, the developers were harassed and insulted online for several weeks following the announcement.

As in traditional sports, the gaming industry is highly competitive, and provocation between players or between supporters of e-sports teams is not uncommon. These exchanges are intended as banter: a way to compare performance, apply pressure, and excite supporters, with no intention of hurting anyone. The whole difficulty lies in ensuring they remain good-natured.

The broken windows theory, a concept from criminology first laid out in a 1982 article in The Atlantic, sheds interesting light on the question. Transposed to video gaming, it would produce the following sequence:

  • A player makes inappropriate comments without facing any personal consequences, much like insulting someone on a social network: the sender forgets the message as soon as it is sent;

  • At this stage, the decent players (not yet “accustomed” to such behavior) report these actions to the relevant teams;

  • No action is taken, the dust settles, and it happens again;

  • Having tried to curb online hatred at their own level, the so-called “normal” players learn that behaving decently or toxically changes nothing. Toxicity then comes to be seen as normal and acceptable within the community;

  • The circle is complete. Some players will throw out a few "dumb and mean" barbs from time to time, while others will increasingly embrace toxicity. There are now no limits.

This theory shows how a simple comment can have long-term consequences, even when made in a spirit of camaraderie. Each problematic comment that goes unaddressed opens the door to further, increasingly offensive ones.

Automatic in-game moderation

The Bodyguard.ai custom API is an intelligent moderation solution that can be implemented directly in games: comments are moderated in-game, in real time. Messages that break the moderation rules defined by the developers (covering misogyny, gender-based slurs, attacks on a person's sexual orientation, etc.) are removed, and the desired atmosphere within the community is preserved.
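To make the integration concrete, here is a minimal sketch of what plugging a game server's chat pipeline into a real-time moderation API could look like. Everything below is an illustrative assumption: the endpoint URL, payload shape, response fields, and helper functions are hypothetical placeholders, not Bodyguard.ai's documented API, so refer to the official documentation for the real contract.

```typescript
// Hypothetical sketch: the endpoint, payload, and response fields are
// placeholders, NOT Bodyguard.ai's documented API.

interface ModerationVerdict {
  allowed: boolean;     // whether the message may be shown in chat
  categories: string[]; // e.g. ["misogyny", "slur"] when blocked
}

// Submit a chat message for analysis before it reaches other players.
async function moderateMessage(
  playerId: string,
  text: string,
): Promise<ModerationVerdict> {
  const response = await fetch("https://moderation.example.com/v1/analyze", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.MODERATION_API_KEY}`,
    },
    body: JSON.stringify({ playerId, text }),
  });
  if (!response.ok) {
    // Fail closed: if the service is unreachable, hold the message back.
    return { allowed: false, categories: ["service_error"] };
  }
  return (await response.json()) as ModerationVerdict;
}

// Chat handler on the game server: only broadcast messages that pass.
async function onChatMessage(playerId: string, text: string): Promise<void> {
  const verdict = await moderateMessage(playerId, text);
  if (verdict.allowed) {
    broadcastToLobby(playerId, text);
  } else {
    notifyPlayer(playerId, "Your message was blocked by the moderation rules.");
  }
}

// Stubs standing in for the game's own networking layer.
declare function broadcastToLobby(playerId: string, text: string): void;
declare function notifyPlayer(playerId: string, message: string): void;
```

The key design point is that moderation sits synchronously in the message path, so a comment that breaks the rules never reaches the rest of the community.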

Video games are, above all, a platform for interaction that brings people together around a common passion. They provided a vital way of maintaining social contact during the COVID-19 pandemic. Overall, playing video games is a positive experience for players, but it can be damaged when toxicity joins the party:

  • 43% of gamers play online with other people;

  • 33% say they have made friends through gaming;

  • 30% feel part of a community, which 89% of them consider benevolent and welcoming.

Interaction is an integral part of the player experience and is even central to certain genres. Publishers need to protect their communities from disruptive behavior and harassment. With an automatic moderation solution, conversations stay healthy and players are protected.

The 3 key points to remember:

  • The gaming community is particularly susceptible to online toxicity

  • Lack of moderation leads to a snowball effect with disastrous consequences for the community and the publisher

  • An automatic moderation solution helps maintain a healthy atmosphere within in-game communities