What tools can I use to moderate?

Learn about Disciple's tools to help manage challenging members.

We have various tools to help you manage and moderate your members to maintain a healthy and positive vibe in your community. 

You should decide how you'd like your members to behave, and what happens if they don't. Agree and share these guidelines and rules with anyone in your community or team who will be helping moderate your community.

You can also add them to the "Terms of Service", which can be set in the Legal docs section of the Console. For an example, see Section 4 of The Collective's rules.

Moderation tools

Blocking

Members can block other members. A member who blocks someone will stop seeing that person's content, and the blocked member will be unable to see content from the member who blocked them. The block option is available on the profile of the member you want to block, and you can unblock in the same place.

If a member has done something worth blocking them for, it's a good idea to encourage members to report them too, so you can investigate and take any other appropriate action.

Members can block someone using the 'three dots' menu on the profile page of the member they want to block.


Reporting

Members can report other members' posts as well as member accounts. Reports of both posts and members are sent to the Moderation section in the Console, where you can review them and take action accordingly (more on this below).

Any member (including hosts) can be made a "trusted reporter", which means that when they report a post, it will be automatically unpublished. This is a great tool if you want to recruit some of your members to quickly remove undesirable content (see the Trusted reporters section below for more detail).

Posts can be reported using the 'three dots' menu next to them, and member accounts using the same menu on the member's profile page.

Shadowban and Disable

A host can shadowban members, meaning they can log in and post, but their content won't be visible to other members. The member isn't notified that they've been shadowbanned. As host, you can still see posts they make from the Console. When a shadowban is removed, any posts made during that time will become visible again (unless you unpublished them). Shadowban is available in the Console on member profiles. Members who have the 'Moderator' permission can also shadowban from the community itself.

A host can disable a member, or disable them and unpublish their content, via the Console. This stops the member being able to log in or use the community, but they aren't sent any notification that their account is disabled. You can enable a member again when the moderation issue has been resolved, or leave the account disabled to stop them logging in or signing up again with the same email address. If you unpublish their content when disabling, it cannot be republished when you enable their account. Disable is available in the Console from the Member List, or on member profiles. Members with the 'Moderator' permission can also disable accounts from the community itself.

Hosts can also use Shadowban and Disable together: stop a member logging in and make all their posts invisible while you review and unpublish any posts as needed, before reactivating their account.

You can Shadowban and Disable from the member's profile page in the Console.


Delete

A host can delete a member account altogether, with the option to delete all their content at the same time. An account cannot be restored once deleted, but a member can sign up again with the same email address if their account is deleted. The member is sent an email letting them know their account has been deleted from the community. If you are removing a member for breaking your community guidelines or rules, we recommend using Disable, not Delete, as this stops them signing back up with the same email address.

You can delete members from the Member list page in the Console.

Email confirmation

Email confirmation is available for your app to ensure only members with a valid email can use your community. This can help stop spam or bot accounts from posting. An account that hasn't confirmed its email address cannot post or engage in the community. Email confirmation is off by default, but we recommend switching it on before you launch your community, or if you start to see problems with spam or unwanted content.

You can also add a CAPTCHA test to the email confirmation process to help stop automated registrations from confirming their email and then posting in the community. 

You can switch CAPTCHA on or off from the Privacy & Security section of the Console. You must have Email confirmation switched on to be able to switch on the CAPTCHA check, as it's part of the confirmation process.

Signup protection

If you see high volumes of suspicious signups, you can enable the signup protection setting. For security reasons, we won't go into detail about what it does, but it adds extra checks for suspicious activity when someone signs up.

This setting could also stop genuine users from signing up, so it's off by default and you should use it with caution. If you want to understand whether this setting might help you, contact support.

You can switch this on and off from the Privacy & Security section of the Console.

Trusted reporters

Any community member (including hosts) can be made a trusted reporter, so that when they report a post, it is automatically unpublished. This is a great tool if you want to recruit some of your app members to help moderate the community. A trusted reporter's report still goes to the Moderation queue, so you can review the report reason and decide whether to take further action beyond the post being unpublished. If you need to republish the post, you can do this via the Posts section in the Console.

Edit any member account via the member list in the Console to make them a trusted reporter.

Moderator permission

Any member who is assigned the Moderator permission can Shadowban and Disable members from the web community. This is a powerful permission, and we recommend you only give it to members of your community team who already have access to the Console. It means that if they spot a problematic member while browsing or engaging in the community, they can quickly shadowban or disable them without having to go to the Console.

Moderation queue

A report of a piece of content or a member goes into the Moderation section of the Console. This gives you the ability to see the issues your members are reporting, review them, and take action if required.

The Moderation queue groups reports where something or someone has been reported more than once - so you can quickly focus on things causing the most reports, and can mark all the reports for an item as ‘closed’ in one group, saving you time.

You can click through to see the content or member being reported if you need to investigate further, and you can also unpublish a post or shadowban a member directly from the queue.

Moderation approach

Whenever you shadowban a member or unpublish their posts, we advise getting in touch with them to explain what they have done wrong and what action has been taken. This gives the member the opportunity to adjust their behaviour in future. Many communities use a '3 strikes' approach to behaviour management, allowing members 3 warnings before issuing a temporary or permanent ban (disabling or even deleting the account).


*Please remember that followers of shadowbanned members will still receive notifications whenever they post. This is by design, so the shadowban doesn't raise suspicion.


Disciple does not provide moderation support, but we can recommend some suppliers.


Looking to create your own community app? Contact our Community Experts - info@disciplemedia.com

Need help with your existing Disciple powered community? Contact our Customer Support team - help@disciplemedia.com