January 20, 2023
Harassment in sports is a major issue, both on the field and in digital spaces. Athletes often face online threats, hate messages, discrimination, and slurs on various social media platforms.
This type of toxic behaviour was recently analyzed in a report on the World Athletics Championships Oregon22. World Athletics published the study’s findings to illustrate the ongoing problem of online harassment in sports.
Sport is far from being immune to toxicity. Athletes not only have to deal with abuse in person but must also protect themselves against online hate.
The World Athletics report on the World Athletics Championships Oregon22 confirmed the scale of the problem.
Harassment in sports takes many different forms, and athletes often have to deal with online abuse. The World Athletics report recorded complaints covering a range of abusive behaviours.
Racism has long been an issue in sports, and athletes frequently face sexism and homophobia as well. Many athletes have reported that their mental health has suffered because of repeated online harassment. Online abuse can severely impact athletes’ self-confidence and impair their ability to perform at their best.
Effective moderation of sport-related online communities can be difficult, however. Human moderators face two main issues: toxic content is detected too late, and it can be hard to identify.
The volume of posts and comments that a sporting event generates online is enormous. Human moderators simply cannot keep up with the avalanche of activity, and in most cases toxic content is not detected in time.
To solve this problem, organizations and clubs must take action and adopt an AI moderation solution that understands the language of sport.
It is hard for human moderators to identify offensive comments. A seemingly innocent chat or comment can have offensive racist overtones or a sexual connotation. Moderating chats where emojis or emotes are used can also be difficult.
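As a rough illustration of why emojis complicate detection, the short Python sketch below shows a hypothetical preprocessing step (this is not Bodyguard’s pipeline, and the emoji map and word list are invented for the example): unless emojis are first mapped to the meaning they carry, a naive keyword check misses abuse expressed entirely through them.

```python
# Hypothetical illustration: why emojis matter when screening comments.
# This is NOT Bodyguard's pipeline; it only shows that a naive keyword
# filter misses abuse expressed through emojis unless they are normalised.

EMOJI_MEANINGS = {
    "🐒": "monkey",   # frequently used as racist abuse against Black athletes
    "🍌": "banana",   # used in the same racist context
    "🔫": "gun",      # used in threats
}

# Invented list of abusive tokens, purely for the demo.
ABUSIVE_TOKENS = {"monkey", "banana", "gun", "choker"}

def normalise(comment: str) -> str:
    """Replace known emojis with the word they stand for, then lowercase."""
    for emoji, meaning in EMOJI_MEANINGS.items():
        comment = comment.replace(emoji, f" {meaning} ")
    return comment.lower()

def looks_abusive(comment: str) -> bool:
    """Crude keyword check run on the normalised text."""
    return bool(set(normalise(comment).split()) & ABUSIVE_TOKENS)

print(looks_abusive("Great race today!"))         # False
print(looks_abusive("Go back to the zoo 🐒🍌"))    # True, but only after normalisation
```

Real moderation systems go much further, using context, language models, and community-specific vocabularies rather than fixed word lists.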
Harassment in sports is often linked to online betting. If a person places a large bet on a team or individual sportsperson and does not get the result they wanted, then they may direct their anger and frustration at the team or athlete in question. This can result in toxic online behaviour.
Bodyguard.ai provides effective moderation solutions that can protect athletes and their fans from suffering online harassment. The smart and autonomous moderation solution employs machine learning and linguistic expertise to accurately identify and remove offensive and toxic comments before they are seen.
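To make the idea of removing comments before they are seen more concrete, here is a minimal sketch of a pre-publication moderation gate. The function names, scoring logic, and threshold are invented for illustration and do not represent Bodyguard’s actual API or models.

```python
# Hypothetical sketch of a pre-publication moderation gate.
# The scoring function is a stand-in; a real system such as Bodyguard
# combines machine learning with linguistic analysis, which is not shown.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Verdict:
    allowed: bool
    reason: Optional[str] = None

def toxicity_score(comment: str) -> float:
    """Placeholder scorer: a real deployment would call an ML model."""
    abusive_terms = {"idiot", "choker", "go home"}
    hits = sum(term in comment.lower() for term in abusive_terms)
    return min(1.0, hits / 2)

def moderate(comment: str, threshold: float = 0.5) -> Verdict:
    """Decide whether a comment may be published at all."""
    score = toxicity_score(comment)
    if score >= threshold:
        return Verdict(allowed=False, reason=f"toxicity score {score:.2f}")
    return Verdict(allowed=True)

# The key point: moderation runs *before* the comment reaches the page.
for text in ["What a fantastic final!", "You choker, go home"]:
    verdict = moderate(text)
    status = "published" if verdict.allowed else f"blocked ({verdict.reason})"
    print(f"{text!r} -> {status}")
```

The design point is simply that moderation sits between the commenter and the page: a comment that scores above the threshold never becomes visible to the athlete or other fans.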
Cyberbullying can negatively impact an athlete’s mental and physical health, and their performance.
Athletes in sports of all kinds are affected by cyberbullying, sexism, and racism. Female athletes report more instances of cyberbullying than male athletes.
Using an AI moderation solution like Bodyguard ensures that toxic comments are removed before they appear online.
Stalking can result in serious penalties, including imprisonment.
Want to find out more about how Bodyguard can protect your club, team, league or players from harmful toxicity online? Let's chat.