Top 10 things you should never do in moderation
Moderation is tough, to say the least. Achieving consistently effective moderation is even harder. But part of getting there is understanding what not to do, just as much as what to do. So, with that, let’s explore the top 10 things that good moderators must avoid.
The Bodyguard.ai team
Number 1: Over-moderating content
We’re all human, which means we all carry individual biases against views, beliefs, or ways of life we disagree with. This is why over-moderating content is a problem — moderators take it upon themselves to remove content they simply don’t like, rather than content that breaks their community guidelines. Content moderators should be guardians of free speech, there to help others express themselves without fear of judgment, so over-moderation is a big no-no.
Number 2: Losing your temper
Being a moderator is difficult even at the best of times, and everyone has their limit. So when a hater starts throwing out extremely hurtful comments, it’s tempting to hate right back at them. But really, this only makes the moderator as bad as the hater. Chill out, stay calm and professional at all times, and deal with haters without stooping to their level. You’re better than this!
Number 3: Doing nothing
Leaving dramas to play out on their own doesn't help anybody — that's a fact. Action is crucial because viral content can quickly and easily hurt people. People who trust a community are more likely to express themselves freely. In contrast, individuals who are harassed or exposed to toxic content are likely to engage less or even leave the community altogether.
Number 4: Ignoring feedback
Listening to feedback and incorporating it into your community is key to building long-term trust and consistent traffic. Whether it’s good or bad doesn’t matter: you should always take time to go through what users are saying and make them feel heard. Ignore it at your peril.
Number 5: Neglecting guidelines
Platforms built to facilitate online communication can work brilliantly, but as soon as a platform allows users to publish their own content, moderation through clear guidelines becomes crucial. Neglecting to develop guidelines risks a negative user experience and inconsistent standards of behavior.
Number 6: Ignoring trolls
Trolls take many forms, but some are toxic to online communities. Spreading negative content, abuse or threats is one of the fastest ways to reduce a user base and damage a reputation. That’s why, when a troll rears its ugly head, it needs to be handled properly. Whatever your chosen response, be it deleting posts or revoking posting permissions, see that you respond consistently and effectively every time.
Number 7: Immediately answering every user question
This might sound counterintuitive, but answering all your community’s questions straight away can backfire, because it discourages users from participating and developing conversations themselves. So, instead of jumping in immediately, hold back and see if another user can answer the question. You have to find your own balance between responsiveness and self-control (back to Number 2).
Number 8: Working too hard
Yep, that’s right. It’s so easy to get burned out as a moderator, so don’t overwork yourself or commit to more than you can handle. Moderation usually comes with a heavy workload, so going too hard will drain your energy and motivation, leading to a lower quality of moderation in the long run. After all, moderators are humans, too. Check out our article on the subject here if you don’t believe us.
Number 9: Staying hidden
A great way to bring your community to life is to act as a role model. That means engaging with your community and reinforcing the kinds of behavior you want to see. Though of course moderators naturally differ in terms of visibility, not being part of a community is a missed opportunity. Remember, communication is key.
Number 10: Doing it all yourself
As mentioned above, moderation involves a lot of hard work. But you can increase the efficiency of your moderation by enabling effective (and respectful) peer-to-peer moderation. It’s a great way to engage your users, teach them your guidelines, and even find new moderators.
Alternatively, you can opt for effective automated moderation tools, like ours. Automated moderation takes a lot of the heavy lifting out of moderators’ lives and makes huge amounts of content easy to manage.
AI moderation: our solution intelligently filters out content and then shows you how it’s been classified and what action has been taken.
Cleaning phase: Bodyguard analyses text, typos, deliberately misspelled words and emojis in context.
Contextual analysis: our technology analyses to whom the toxic content is addressed.
Classification: easily organize messages into categories of severity.
Live streaming protection: automatically moderate millions of comments in real time during live events.
Dashboard: get access to insightful metrics that help you foster respect and engage your community.
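The feature list above describes a classic moderation pipeline: clean the text, analyse its context, then classify it by severity. As a toy illustration only — this is not Bodyguard’s actual implementation, and every name, rule, and word list here is invented — such a pipeline might be sketched like this:

```python
import re

# Toy substitution table to undo simple deliberate misspellings (invented).
LEET_MAP = str.maketrans({"@": "a", "0": "o", "1": "i", "3": "e", "$": "s"})

# Placeholder lexicon for illustration, not a real toxic-word list.
TOXIC_WORDS = {"idiot", "loser"}

def clean(text: str) -> str:
    """Cleaning phase: lowercase and undo simple character substitutions."""
    return text.lower().translate(LEET_MAP)

def is_addressed_to_someone(text: str) -> bool:
    """Contextual analysis (simplified): is the message directed at a user?"""
    return bool(re.search(r"(^|\s)@\w+|\byou\b", text, re.IGNORECASE))

def classify(text: str) -> str:
    """Classification: bucket the message into a severity category."""
    cleaned = clean(text)
    toxic = any(word in cleaned.split() for word in TOXIC_WORDS)
    if not toxic:
        return "clean"
    # Insults aimed directly at a person are treated as more severe.
    return "high" if is_addressed_to_someone(text) else "medium"

print(classify("Great stream, thanks!"))  # → clean
print(classify("what an 1d10t move"))     # → medium
print(classify("you are an idiot"))       # → high
```

A real system would of course use trained models and far richer context than keyword matching, but the three-stage shape — normalise, contextualise, classify — is the part this sketch is meant to show.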
To find out more about our services and how automatic moderation can help you manage your community without straying into censorship, visit our site here.