A new approach for maximum transparency.

5 August 2020 - Becca Melhuish

Find out how our community moderation features bring complete transparency and accountability to the moderation process.

Image: Citizen OS software on the screens of an iPhone and an iPad.

More often than not, moderators in online communities are invisible, unmonitored and unaccountable. They rarely have to justify their decisions, and there is seldom any public record of the content they have censored. This gives moderators a great deal of power, allowing them to quietly change the course of a discussion to suit their own agenda, should they so wish.

But thanks to an update to the Citizen OS platform, this is no longer the case for our users. We’ve redesigned our moderation process to bring maximum transparency, something we value highly as one of the vital cornerstones of participatory democracy.

Community moderation

Now, by default, groups using the Citizen OS platform for collective decision-making are moderated by a team of their own users, through our new community moderation process. This team is made up of all the participants in a discussion topic who mark themselves as willing to moderate, thereby volunteering for the role.

If a user reports an offensive comment or topic on the platform, it is flagged to all members of the moderation team. Whichever moderator responds to the report must then leave an explanation justifying their decision to either hide the content or allow it to remain.

If a moderator chooses to hide the content, its original author is notified of the decision, along with the reasons the moderator provided. The author then has the opportunity to edit out whatever caused offence and have the content made visible once again.

To keep the process fully transparent, no content can ever be deleted by a moderator: it can only be hidden. This means that other users can easily track the whole moderation process, seeing what content has been reported and why, what action was taken by which moderator and why, and, if they choose, even the original offending content itself, by selecting the option to “view previous versions”. The only part of the process that is not visible to other users is the identity of the person who reported the content, so that no-one need feel afraid or uncomfortable about raising an issue.

While unlikely, there’s always a chance that no-one steps forward to join the moderation team. In that case, moderator rights are automatically allocated to the administrator of the topic. As in the community moderation process, the admin is required to be just as transparent: they still have to justify their decisions publicly to users, and they can only hide comments, not delete them entirely.
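For readers who like to see a process expressed concretely, the flow described above can be summarised roughly as in the sketch below. This is a purely illustrative TypeScript model, not the platform’s actual code; all type and function names here are hypothetical.

```typescript
// Illustrative sketch of the community moderation flow (hypothetical names).

type Decision = "hidden" | "allowed";

interface Report {
  contentId: string;
  reason: string;
  reporterId: string; // stored, but never exposed in the public audit trail
}

interface ModerationRecord {
  contentId: string;
  moderatorId: string;
  decision: Decision;
  explanation: string; // a justification is always required
  reportReason: string;
}

interface Content {
  id: string;
  authorId: string;
  versions: string[]; // earlier versions stay viewable ("view previous versions")
  hidden: boolean;
}

// Hiding is the strongest action available: content is never deleted.
function moderate(
  content: Content,
  report: Report,
  moderatorId: string,
  decision: Decision,
  explanation: string
): ModerationRecord {
  if (!explanation.trim()) {
    throw new Error("Moderators must justify every decision.");
  }
  content.hidden = decision === "hidden";
  // The record is public, but it omits the reporter's identity.
  return {
    contentId: content.id,
    moderatorId,
    decision,
    explanation,
    reportReason: report.reason,
  };
}

// If the author edits hidden content, the old text remains as a prior version
// and the revised content becomes visible again.
function editAndRestore(content: Content, newText: string): void {
  content.versions.push(newText);
  content.hidden = false;
}
```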

The bigger picture

Margo Loor, our CEO, explains how these recent developments fit into Citizen OS’s vision for the future:

“Our new community moderation features help fulfil some of our key aims as an organisation—facilitating collective online decision-making through respectful discussions, and with maximum transparency. It’s one more step in our journey of making participatory e-democracy a viable and desirable path for society to take.”
