September 15, 2023

TikTok moderation: why it’s important and how to do it

By Lucy Wright

TikTok first burst onto the social media scene in 2016, but it was the Covid pandemic and the first lockdown of 2020 that saw its popularity surge to dizzying new heights. 

The rise and rise of TikTok

In the first quarter of 2020, the app was downloaded 315 million times globally; the highest ever number of downloads for an app in one quarter and a 58% increase on the quarter before. 

Fast forward three and a half years and TikTok is the sixth largest social media network in the world by monthly active users, with some impressive statistics:

  • 1 billion monthly active users
  • 90% of TikTok users access it on a daily basis
  • Average time spent on TikTok is 52 minutes a day (the highest of any social media platform)
  • 4 in 5 users say TikTok is ‘very’ or ‘extremely’ entertaining 

Perhaps most remarkable is the growth TikTok has achieved since its launch seven years ago. TikTok’s parent company, ByteDance, says the platform’s global monthly active users have grown 1,158% since January 2018. In the United States alone, TikTok has seen a 789% increase in users over that period.

And despite being a relatively new kid on the social media block, TikTok already has more monthly active users than X (formerly Twitter), Reddit, Pinterest and Snapchat.

With such rapid growth and an ever-increasing user base, there’s a clear case for businesses to be on TikTok. Few other social media platforms offer the same kind of reach and brand exposure. 

It’s also a proven platform for buying. Studies have found that 55% of TikTok users have made a purchase after seeing a brand or product on the platform, and 27% of users will make their purchase through the platform itself.

This tells us that businesses can benefit hugely from being on TikTok; but their account needs to be managed correctly to make sure it is an asset to their business and not a hindrance. 

Why TikTok moderation matters 

According to its 2023 trends report, one of TikTok’s trend forces for this year has been “making space for joy”, and 4 in 10 users say that having their "spirits lifted" is a key motivator in whether or not they make a purchase. So, it’s clear that users are looking for positive, uplifting content that makes them feel good.

The good news is there’s plenty of that on TikTok; but there is also a huge amount of toxicity which can be incredibly harmful to users, and if you’re a business, to your image and reputation. 

This doesn’t mean that businesses should avoid being on TikTok, but it does mean they should moderate user-generated content on their page. Toxic messages, spam and scam content simply can't be left for the whole world to see. Toxicity needs to be removed as soon as possible in order to create a safe, positive and inclusive environment which reflects a brand's identity and values.

This is even more important given TikTok’s young user demographic. The platform is especially popular with Gen Z and millennials, and almost half of all users (47%) are aged between 10 and 29. The largest user group is 10 to 19 year olds. In fact, TikTok has the youngest user base of all social media networks. 

With such a young audience consuming content, it’s essential that harmful messages and scam content are removed before they can do damage. A 2021 UK study found that 18-34 year olds are twice as likely to fall victim to a scam. So, any reputable brand that wants to protect its online community and its integrity will want to make sure its social media isn't home to toxic content.

It can be hard for human moderators to keep up with the high volume of messages generated on social platforms, or to decide quickly and accurately whether a comment is toxic. An automated solution designed specifically for social media moderation lets businesses handle more user-generated content, more quickly.
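As a toy illustration of the idea, the core filtering step of an automated pipeline can be sketched as a score-and-threshold check. The lexicon, scoring function and threshold below are invented for the example; a real system would use a trained model rather than keyword matching:

```python
# Hypothetical sketch of automated comment moderation: score each incoming
# comment for toxicity and drop anything at or above a threshold.
# TOXIC_TERMS and the scoring logic are illustrative assumptions only.

TOXIC_TERMS = {"scam", "idiot", "click this link"}  # toy example lexicon

def toxicity_score(comment: str) -> float:
    """Return a naive 0-1 toxicity score based on flagged terms."""
    text = comment.lower()
    hits = sum(term in text for term in TOXIC_TERMS)
    return min(1.0, hits / 2)

def moderate(comments: list[str], threshold: float = 0.5) -> list[str]:
    """Keep only comments scoring below the toxicity threshold."""
    return [c for c in comments if toxicity_score(c) < threshold]

comments = [
    "Love this video!",
    "Total scam, click this link to claim your prize",
]
print(moderate(comments))  # only the first comment survives
```

The point of automating this step is throughput: the same check runs identically on ten comments or ten million, which is what human-only moderation struggles to match.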

How to do it

Bodyguard is thrilled to be able to moderate TikTok, along with all other major social media platforms. 

Our AI solution detects and removes toxic messages in real time, so that online communities can enjoy interacting, without being exposed to harmful content which can spoil the experience. 

We let every organisation apply its own tolerance level to its online community, from permissive to very strict, so that moderation meets its requirements exactly. The Bodyguard dashboard also gives users visibility of all their social media pages in one place: a single pane of glass for both moderation and analytics across platforms that share data.
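To make the idea of a tolerance level concrete, here is a minimal sketch of how a named level could map to a removal threshold. The level names and numeric values are invented for illustration and are not Bodyguard's actual configuration:

```python
# Illustrative only: map a named tolerance level to a toxicity threshold,
# so the same moderation pipeline can run looser or stricter per community.

TOLERANCE_LEVELS = {
    "permissive": 0.9,   # only the most clearly toxic content is removed
    "moderate": 0.6,
    "very_strict": 0.2,  # almost anything flagged gets removed
}

def should_remove(score: float, tolerance: str) -> bool:
    """Remove a message if its toxicity score exceeds the chosen threshold."""
    return score > TOLERANCE_LEVELS[tolerance]

print(should_remove(0.5, "permissive"))   # False
print(should_remove(0.5, "very_strict"))  # True
```

The design point is that the detection model stays the same; only the cut-off changes, so a gaming community and a children's brand can share one pipeline with different settings.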

If you’re already moderating your social media with Bodyguard, we’d love to add your TikTok account to the mix. If you’re new to content moderation and just getting started, let’s have a chat about how we can improve the experience for your online community and enhance your brand reputation at the same time.

Contact us here.
