How to Manage Abusive Content in Your Online Community

Community managers know all too well how draining it is to manage user-generated content. You can spend hours browsing through comments, images, events, and groups. Luckily, content reporting is the solution that will keep you – and your team – sane.

There is a better way than moderating content by yourself.

The social technology industry has long recognized the potential of using members to moderate content on social platforms. Instead of scrolling through everything manually, you can let your users report content themselves using a content reporting feature.

Nowadays, content reporting is not just conducted by members but also by artificial intelligence. Some larger companies receive more than 60,000 content reports. I wouldn’t want to be the manager who has to scroll through all of those.

Algorithms are used to analyze reported content by risk level and priority. The best part is that they can be trained on your moderation decisions, making them a better fit for your community over time. However (and there always is a “however” when it comes to technology), computers can only recognize the easy cases.
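
To make the idea concrete, here is a minimal sketch of such a triage algorithm, assuming a hypothetical report record with a reason and a report count. The category names, weights, and learning rule are all illustrative, not any vendor’s actual model:

```python
# A minimal sketch of report triage. The categories, weights, and
# learning rule below are illustrative assumptions only.

RISK_WEIGHTS = {
    "child_safety": 1.0,  # zero-tolerance categories float to the top
    "violence": 0.9,
    "harassment": 0.7,
    "spam": 0.2,
}

def risk_score(report):
    """Score a report so the riskiest items are reviewed first."""
    base = RISK_WEIGHTS.get(report["reason"], 0.5)
    # Content flagged by many members is more likely to be a real problem.
    return base + 0.05 * report["times_reported"]

def triage(reports):
    """Sort the moderation queue from highest to lowest risk."""
    return sorted(reports, key=risk_score, reverse=True)

def learn(report, moderator_removed):
    """Nudge a category's weight up or down based on the moderator's decision."""
    delta = 0.01 if moderator_removed else -0.01
    current = RISK_WEIGHTS.get(report["reason"], 0.5)
    RISK_WEIGHTS[report["reason"]] = min(1.0, max(0.0, current + delta))

queue = triage([
    {"reason": "spam", "times_reported": 1},
    {"reason": "harassment", "times_reported": 4},
])
print(queue[0]["reason"])  # harassment is reviewed first
```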

You still need human moderators to make complex decisions in the grey areas. We’re much better at understanding both human behavior and context.

For example, Facebook’s algorithm once removed the horrifying but iconic photo of a young girl fleeing a napalm attack in Vietnam. It’s easy to see why the algorithm flagged the image, and why a human moderator would recognize its historical importance.

Your community members are still more reliable than AI solutions.

That’s why your members are your best resource for handling abusive content in your community. And to put them to work effectively, you need their trusty sidekick: your community guidelines.

Two ways to moderate abusive content

1. Define abusive behavior

You need clear and easy-to-find community guidelines and rules to handle abusive content in your community.

First, help members recognize abusive content by clearly defining it, and highlight the importance of their role in keeping the community safe.

Here’s an introduction we prepared for you to use in your guidelines:

As community managers, we cannot be present for every interaction and upload. We rely on our members to report abusive and potentially harmful content. We may make exceptions to these policies based on artistic, educational, or documentary considerations. But there are a few clear lines we have a zero-tolerance policy toward: …

Naturally, some definitions of abusive content are universal:

  • Threatening or harassing other users;
  • Sexualized content of another person without consent;
  • Content that exploits or abuses children;
  • Content that supports or promotes violent behavior;
  • Discrimination against others;
  • Fraud and impersonation;
  • Invasion of privacy;
  • Spamming.

However, you may want to add further definitions that specifically apply to your community.

For example, you may want to cover plagiarism of other members’ work in education or ideation communities. You can find further instructions from us on how to set up community guidelines.

2. Define strict repercussions

How do you handle offending members? What happens to repeat offenders? Once a piece of content has been reported, you need to decide how to handle the member who posted it.

Use your community guidelines to define the consequences for first-time and repeat offenders. These repercussions need to be strict so that abusive behavior is (hopefully) avoided up front.

Typically, first-time offenders are let off with a warning. You may even decide to ban them from posting content for a period of time. On the other hand, repeat offenders are usually expelled from the community. Don’t be afraid to remove these members immediately. In the end, you want to create a place that’s safe for your members and that should be your first priority.

The consequences may also differ according to the type of content that was posted. That’s why you can divide content into:

  • Zero-tolerance content: content that results in immediate expulsion.
  • Abusive content: first-time offenders are let off with a warning.

As a community manager, it will be your job to decide how to handle incoming abusive content and to make the final decision on the repercussions.
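
As a rough illustration, that two-tier policy could be encoded like this; the category names and the offense counter are assumptions for the sketch, not Open Social’s actual implementation:

```python
# A minimal sketch of the two-tier repercussion policy described above.
# The category names and actions are illustrative assumptions.

ZERO_TOLERANCE = {"child_safety", "threats", "violence"}

def repercussion(reason, prior_offenses):
    """Decide what happens to the member who posted reported content."""
    if reason in ZERO_TOLERANCE:
        return "expel"   # zero-tolerance content: immediate expulsion
    if prior_offenses == 0:
        return "warn"    # first-time offenders are let off with a warning
    return "expel"       # repeat offenders are removed from the community

print(repercussion("spam", prior_offenses=0))  # warn
print(repercussion("spam", prior_offenses=2))  # expel
```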

Content reporting in Open Social

The content reporting feature in Open Social allows your members to report inappropriate, abusive or spam content.

Report content in your community

As a community member, you can report any content you find abusive by:

  • Selecting the content that you wish to report;
  • Selecting the report option from the drop-down menu in the corner;
  • Choosing a reason why this content is a violation and, if requested, adding an explanation.

As a community manager, you can enable this feature for your community. You have quite some flexibility with the settings; for example:

  • You can allow the community to unpublish content immediately.
  • You can define the reasons for reporting the content yourself.
  • You can choose whether providing a reason for reporting is mandatory.
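
Purely as an illustration of that flexibility, the three settings above might be captured in a configuration sketch like the following; the keys are hypothetical and do not reflect Open Social’s real configuration schema:

```python
# Hypothetical settings sketch; these keys are illustrative and do not
# reflect Open Social's actual configuration schema.

reporting_settings = {
    "unpublish_on_report": True,   # hide reported content immediately
    "reasons": [                   # the reasons members can choose from
        "Abusive content",
        "Spam",
        "Inappropriate content",
    ],
    "reason_required": False,      # whether an explanation is mandatory
}
```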

You can then navigate to an overview where you can view all reported content and take action on it immediately, such as deleting it.

Manage reported content in your community

Read more about Content Moderation on Open Social here.

You don’t need to manage it alone

You don’t need to spend all your valuable time reviewing community content. Even the most diligent moderators can fall prey to one of two traps: everything begins to look bad, or nothing does. That’s when you turn to your members.

By enabling the content reporting feature, you not only ensure a much healthier community for your members, but also increase the engagement of your current members.

Download the free 10 Steps For Your First Year of Community Building guide to find out:

1. Why and how you should adjust the look and feel of your online community
2. Why and how you should run a Beta community
3. Things to consider regarding community moderation
4. Why and how you can set up a content calendar
5. Tips and tricks to increase engagement in the first year of your community
6. How you can use social media to find new users and boost your community
7. How you can use our notifications to increase engagement
8. How to source co-community managers from within your community
9. The benefits of on- and offline meetups to solidify your community
10. How to use the Open Social Analytics suite to improve your strategy

Download Your Free Guide Here
