Can your community police itself?
Community-led moderation, a method well researched and tested across the web, relies on dedicated, loyal community members appointed by a publisher to review the community’s conversations and handle contributors who run afoul of the guidelines.
Just think of Wikipedia or Reddit (don’t worry, we’ll get into these examples below). On these sites, the community itself plays an active role in driving high-quality contributions.
Despite these sites’ successes, they’ve certainly had their fair share of controversies. So, while the community is an important pillar of moderation, it can’t be the be-all and end-all of your moderation strategy. You need community moderation tools.
That’s why OpenWeb uses multiple layers of moderation to examine the millions of conversations we host every day. On top of AI-based algorithms and third-party integrations, we offer professional moderator services that add a human touch, ensuring that all moderation decisions are as accurate as possible.
Combined, moderation technology and human review are a powerful force in a publisher’s moderation toolkit. So why not empower your best and most loyal users with the right tools to become community moderators themselves?
Our partners who have embraced community-led moderation have seen increased engagement, higher-quality conversations, and improved overall moderation. In fact, a number of major platforms already give their community members the ability to take action on comments that don’t meet their site’s guidelines.
For instance, Wikimedia, the parent organization of Wikipedia, famously works with thousands of dedicated “Wikimedians” around the world to power its services. On Reddit, subreddit creators and the users they appoint make the majority of the moderation decisions on the platform.
Publishers who leverage community moderation tools can benefit in a few ways:
1. Establish thorough human review
Community-led moderation allows publishers to thoroughly review more of the comments within their community. When questionable content gets flagged, either by the moderation platform or by a publisher’s readers, an accurate human review is possible. This helps publishers improve their moderation efforts and ensure that nothing falls through the cracks.
2. Promote a sense of ownership
When top contributors are given the responsibility of maintaining the quality of their own community, they become positive role models, inspiring others to follow. They are driven to support informed conversations, preserve their community’s ethos, and strengthen the sense of belonging. Moderators tend to contribute more, and so do others in the community, increasing overall engagement.
3. Increase the quality of conversations
Research suggests that non-anonymous commenters are nearly three times as likely to remain civil in their comments as those who are anonymous. Positive, civil role models set norms and encourage other users to follow suit, leading to higher-quality conversations. Quality conversations, in turn, keep more users on-site longer and interacting more.
4. Turn your community into lifetime value
By offering top users a stake in the community’s overall well-being, you recognize them as leaders within the community and turn them into valued long-term members. They not only safeguard your policies and guidelines, but also drive benefits such as increased time spent, page views, retention, and more.
Building user-moderated communities on the open web
Community-led moderation benefits both users and publishers: users feel valued when they have a sense of ownership and accountability in their community, and publishers who enlist users to help uphold their site policies can build stronger, more engaged communities.
Of course, advocating for user-led moderation in the publishing industry is far from simple. For most publications, the idea of giving users this type of power sounds risky, to say the least. We understand that. So we decided to try this concept with a “do things that don’t scale” approach. Watch this space for more updates.