Community Managers: Their challenges and how to solve them

Imagine waking up at 3 a.m. to check your phone in case a new crisis needs solving. Imagine sleeping 10 hours a week to make sure you don’t lose clients. Imagine having to read hundreds of toxic comments every week and trying not to let them affect you. Imagine being a community manager.

Bastien


Constantly exposed to toxic content and faced with an ever-growing volume of posts that need verifying instantly, the community gatekeeper has a job that is not for the faint-hearted.

People frequently say that community managers play an essential role in the strategy and the life of a business. Their job is not an easy one, and the responsibility that comes with it is far from small. Community managers need to nurture a company’s following, promote engagement within its communities, encourage user-generated content, and keep the community together. All that while looking out for the company’s online reputation and striving to keep it clean.

More than once, a PR crisis has originated from within a business’s own community: content that was posted and spiraled out of control before anyone could contain it, or a debate between community members that quickly turned ugly. Beyond the immediate dent in the company’s image, the long-lasting impact includes a decline in user acquisition and retention, loss of advertising revenue, and a drop in user engagement. Often, all of that could have been prevented.

Constant exposure to toxic content

Toxic content has an incredible power to destroy a community if left unaddressed. Hateful, hurtful or toxic comments should have no place on a company’s social media channels, yet we’re dealing with more toxic content than ever on social media and the Internet in general. Hiding behind the anonymity of an Internet connection, many keyboard warriors verbally attack, harass, bully, and slam others. They make use of insults, threats, body-shaming, sexual harassment, racist comments, misogynist or homophobic remarks: anything goes for them.

Community managers must spend hours every day manually scanning and analyzing content in order to detect and prevent anything toxic, so that they can protect their communities.

All that constant exposure to high volumes of toxic content takes its toll on the community gatekeepers. Interviews with community managers have revealed that most of them sink under the enormity of their workload, which seems to grow bigger every day. Depression, anxiety, and lack of sleep are just some of the effects these managers have to endure in order to keep their community together.

Constantly growing volume of content

Pretending that the way community management currently works is scalable and sustainable is simply refusing to face the truth. Businesses need to address this issue so that their community managers can focus on what matters most and maintain good mental health.

By automating part of the content review process, companies could run a more efficient moderation system, freeing up more of their community managers’ time while also better shielding them from all that unnecessary toxicity. Rather than having to manually go through thousands of messages every day or week, community managers could intervene only when necessary and spend the rest of their time thinking about new ways to engage their community.

What goes and what doesn’t?

Another challenge community managers face is having to decide what is acceptable and what isn’t when it comes to users posting content on a company’s social media channels. What can users say? What should be restricted? The responsibility that comes with this role is directly related to the potential impact on the business should things go wrong.

More often than you’d think, businesses rely on their community manager’s moral compass when it comes to deciding what is acceptable for their community and what should be removed.

This option, however, is far from ideal. Putting all that responsibility on one person’s moral compass is unfair. When things go wrong and you start blaming that person, you are basically questioning their moral compass, which shouldn’t have been the deciding factor in the first place. Additionally, if you have more than one community manager, whose moral compass will you let guide your way?

To avoid this situation, the best thing to do is to put clear community guidelines in place.

Community guidelines are a set of rules for your community members that determine what users can and cannot do or say in that community. Suppose an individual posts a rude or racist comment or violates the terms in any other way: there should be consequences. That user could be banned from posting comments or content for a certain amount of time, or even removed permanently. Having this set of rules encourages members to keep your channels clean and safe for everyone.

72% of users are unlikely to return to your platform if they encounter toxic content, and every negative comment that appears makes more people likely to leave your page. This is why community managers invest so much effort into keeping their communities clean and free of hate. Having community guidelines, a contract all viewers agree to, would help keep your site safe by adding an additional layer of rules and giving your community managers the confidence and authority to enforce them.

Good community guidelines usually address two things. Firstly, they should determine what content is not allowed on your platform, that is, what your viewers aren’t allowed to post.

Viewers need to know what they are allowed to say or do on your site, and what is considered disrespectful to the community. It’s also important to distinguish between the different types of behavior that should be banned from your page. For example, spamming or trolling would not be treated in the same way as racism or sexism.

Secondly, community guidelines need to establish the consequences when someone breaks the rules. If your website promotes inclusion, a comment about an individual's physical appearance would not be tolerated and the repercussions would be severe, whereas if your page is a fashion blog, some criticism of appearance would be expected. Depending on the action or comment made and the kind of site you run, the consequences will be more or less severe.
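To make those two parts concrete, here is a minimal sketch of how guidelines could be written down as explicit rules instead of being left to one person’s moral compass. The categories, ban durations and the Rule structure are illustrative assumptions for this example, not a prescribed standard.

```python
# A sketch of community guidelines encoded as data: each rule pairs a type of
# violation with its agreed consequences, so decisions don't rest on one
# person's judgement. All names and durations here are illustrative.

from dataclasses import dataclass

@dataclass
class Rule:
    category: str        # type of violation
    description: str     # what counts as this violation
    first_offence: str   # consequence on the first strike
    repeat_offence: str  # consequence for repeat offenders

COMMUNITY_GUIDELINES = [
    Rule("spam", "Repetitive, off-topic or promotional content",
         first_offence="delete comment + warning",
         repeat_offence="7-day posting ban"),
    Rule("trolling", "Deliberately provocative or disruptive behaviour",
         first_offence="delete comment + warning",
         repeat_offence="30-day posting ban"),
    Rule("hate_speech", "Racist, sexist, homophobic or body-shaming remarks",
         first_offence="delete comment + 30-day posting ban",
         repeat_offence="permanent removal from the community"),
    Rule("threats_harassment", "Threats, sexual harassment or targeted abuse",
         first_offence="delete comment + permanent removal",
         repeat_offence="delete comment + permanent removal"),
]

def consequence(category: str, is_repeat: bool) -> str:
    """Look up the agreed consequence for a given violation category."""
    for rule in COMMUNITY_GUIDELINES:
        if rule.category == category:
            return rule.repeat_offence if is_repeat else rule.first_offence
    return "no action"  # content that breaks no rule stays up
```

Writing the rules down like this also makes them easy to publish to your community and to apply consistently across several community managers.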

Applying community guidelines to your platforms

Community guidelines set out the rules to follow, but they can’t by themselves prevent toxic content from reaching your platform. For that, you still need someone or something that monitors all the content and decides what complies with the community guidelines and what doesn’t.

While manual moderation has its clear limitations, a technology solution could help you monitor and analyze every comment, deleting anything that doesn't comply with your guidelines.
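As a rough illustration of where such a technology could sit in the publishing flow, the sketch below checks every comment against the guidelines before it reaches the community and only pulls a human in for ambiguous cases. This is a generic, hypothetical example, not any vendor’s actual product or API; the `classify` function, its placeholder term list and its confidence scores are assumptions made purely for illustration.

```python
# A sketch of an automated moderation step: classify each comment, remove
# clear violations before anyone sees them, and queue borderline cases for
# a community manager. classify() is a stand-in for whatever detection
# technology you use (a moderation API, an in-house model, keyword rules).

from typing import Optional, Tuple

def classify(text: str) -> Tuple[Optional[str], float]:
    """Return (violation_category, confidence); (None, 0.0) if the text looks clean.
    Placeholder implementation for illustration only."""
    banned_terms = {"insult_example": "hate_speech"}  # stand-in for a real model
    for term, category in banned_terms.items():
        if term in text.lower():
            return category, 0.9
    return None, 0.0

def handle_comment(text: str, author: str) -> str:
    category, confidence = classify(text)
    if category is None:
        return "published"                  # clean content goes straight through
    if confidence >= 0.8:
        return f"deleted ({category})"      # clear violations never reach anyone
    return "queued for human review"        # only edge cases reach the community manager
```

The point of the design is the last line: the community manager only sees the small fraction of content the system is unsure about, instead of every message posted.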

A solution like Bodyguard detects toxic comments and removes them before they ever reach your community members or your community managers. Community managers would no longer spend heaps of time looking over content: anything harmful would be deleted instantly.

By combining the effort of a community manager with the force of an automatic moderation technology, businesses could help their community managers save time and have a healthier work-life balance. And who wouldn’t want that?
