August 3, 2023
The 2023-24 football season is about to kick off, and with it comes the excitement for fans of knowing that anything could happen.
The new season will ignite a flurry of activity on the social media pages of football teams around the world, as fans surge online to talk tactics, transfers and team news. But both top-flight teams and those in the lower leagues will be facing a battle on and off the pitch this season: online toxicity.
Whether it’s racist abuse aimed at players or a flood of links to fake streaming sites, toxic content can seriously damage a club’s image, reputation and the online experience for fans.
For all kinds of brands and businesses, the line between the offline and online worlds is increasingly blurred. Football is no exception. Today, a team’s social media pages are the online equivalent of its stadium or club shop: packed with fans showing their support or criticism and voicing their opinions. They have become a club’s ‘storefront’, and what happens there is a direct reflection of the club’s values.
Unfortunately, football teams’ social media pages are also full of toxic messages. Racist abuse, homophobic comments, threats and undesirable content like malicious links and cryptocurrency scams can all be found on the pages of some of the world’s biggest clubs.
Scammers are also increasingly infiltrating top-tier teams’ social media pages to offer fake tickets to the most sought-after games. With many matches selling out almost instantly, desperate fans are duped into parting with hundreds or even thousands of pounds trying to secure a ticket to a high-profile game. Once the money has been transferred, the scammer disappears and is usually untraceable.
A leading UK bank found that, during the 2022-2023 season, victims of ticket scams for clubs in the Premier League lost an average of £154 each. The same bank found that, amongst its own customers, the number of people being scammed when buying football tickets had risen by 101% compared to the previous season. More than 90% of the football ticket scams originated on just three social media platforms: Twitter, Facebook and Instagram. Fans aged 18-24 were the most likely to fall victim to a ticketing scam.
Malicious links to fake streaming sites are also a problem, especially as the rising cost of living tempts people to try and watch their favourite team for free. Unsuspecting fans click on links believing they’ll be taken to a website where they can watch games without paying. Instead, they’re directed to fake pages designed to infect their computers with malware through downloads or pop-ups.
A new report from security experts Avast found a 39% increase in this kind of streaming-related attack in the UK in 2023, whilst in France such attacks increased by 93%. Globally, the data showed an average 45% increase in URL-based attacks during football matches, and the same rise during Champions League games.
Visit Manchester United’s club shop on match day and you probably won’t find unscrupulous scammers selling cryptocurrency or counterfeit tickets. Similarly, anyone caught shouting racist abuse would be quickly removed from the stadium (and banned!). So why does this kind of behaviour remain on the social media pages of some clubs?
In an ideal world, toxic content wouldn’t exist. But as long as a club has a social media presence, these kinds of messages will appear. The sheer volume of comments that a sports team can attract on a single post is too much for human moderators to manage, and many clubs still haven’t recognised how intrinsically linked their social media pages are to their image, reputation and online fan experience.
It can also be difficult for human moderators, and for simple automated filters, to identify whether an interaction is innocent or harmful, particularly if it contains emojis or carries covertly racist undertones.
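To make this concrete, here is a minimal sketch of naive keyword filtering, the kind of rule a stretched moderation team might automate first. The blocklist and example comments are hypothetical, chosen only for illustration:

```python
# A minimal sketch of naive keyword moderation; the blocklist and
# example comments are hypothetical, for illustration only.

BLOCKLIST = {"idiot", "scum"}  # explicit insults a simple filter can match

def naive_filter(comment: str) -> bool:
    """Return True if the comment contains a blocklisted word."""
    words = comment.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

# Explicit abuse is caught:
print(naive_filter("You absolute idiot!"))                  # True

# Emoji-coded racist abuse slips through, because no banned *word* appears:
print(naive_filter("Great performance from the 🐒 today"))  # False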
For these reasons, automated content moderation makes total sense for football clubs and other sports teams trying to get a grip on toxicity in their online communities.
Bodyguard protects the online communities of some of the world’s biggest football clubs. Our smart, automated moderation solution uses machine learning and natural language processing (NLP) to identify toxic content with superior accuracy and remove comments before they can do harm.
We remove up to 95% of online toxic content whilst putting the power to decide what will and won’t be tolerated in the hands of the club.
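As an illustration only (Bodyguard’s production system is proprietary, and every name below is a hypothetical stand-in), an automated, club-configurable moderation decision might be shaped like this:

```python
# Illustrative sketch of a club-configurable moderation decision.
# classify() stands in for an NLP toxicity model; the categories,
# threshold and Verdict type are assumptions, not Bodyguard's real API.

from dataclasses import dataclass

@dataclass
class Verdict:
    category: str  # e.g. "racism", "threat", "scam_link", "neutral"
    score: float   # model confidence between 0.0 and 1.0

def classify(comment: str) -> Verdict:
    """Stand-in for a trained ML classifier; a trivial heuristic here
    so the sketch runs end-to-end."""
    if "http" in comment.lower():
        return Verdict("scam_link", 0.9)
    return Verdict("neutral", 0.99)

def moderate(comment: str, blocked: set[str], threshold: float = 0.8) -> str:
    """Remove a comment only when it falls into a category the club has
    chosen to block and the model is sufficiently confident."""
    verdict = classify(comment)
    if verdict.category in blocked and verdict.score >= threshold:
        return "remove"
    return "keep"

# Each club sets its own policy:
club_policy = {"racism", "threat", "scam_link"}
print(moderate("Watch the match free!! http://fake-stream.example", club_policy))  # remove
print(moderate("What a goal!", club_policy))                                       # keep
```

The design point mirrors the paragraphs above: the classifier does the heavy lifting at volumes no human team could handle, while the club, not the vendor, decides which categories cross the line.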
In the 2022-2023 football season alone, Bodyguard moderated more than 25 million messages across our three biggest football club partners; on average, 18% of the ‘undesirable’ messages we identified were scam content.
Our deep knowledge and understanding of the sports industry, paired with our cutting-edge AI technology, make our content moderation solution ideal for sports clubs of every kind that want to create a safer, more positive and more authentic online space for fan engagement.
In doing so, sports teams can be confident they are protecting their reputation and revenue, and stopping fans from falling victim to scams in their online communities.
Ready to speak with the team? Contact us today.