San Francisco, Jun 17 (IANS): Chat platform Discord has introduced AutoMod, a new automated moderation tool that lets servers preemptively detect and block harmful messages and spam.
AutoMod comes equipped with keyword filters that can automatically detect, block, and alert you to messages containing harmful words or phrases before they are ever posted in your text channels, Threads, and Text Chat in Voice.
"Since AutoMod's built directly into your Discord server, it is a great starting point for new communities looking to proactively keep their server safe," the company said in a blogpost.
"For long-standing servers with a seasoned team of moderators and self-crafted mod bots, AutoMod acts as an additional layer of protection in case something goes awry," it added.
You can even have users who try to post harmful words or phrases be Timed Out automatically, so they cannot continue posting until a moderator is back.
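For server owners who manage moderation programmatically, Discord also exposes Auto Moderation rules through its developer API. The sketch below, written in Python with the requests library, creates a keyword rule that blocks matching messages and times out the sender; the bot token, guild ID, and keyword list are placeholders, and the bot is assumed to have the Manage Server and Moderate Members permissions.

```python
import requests

# Hypothetical placeholders -- substitute your own values.
BOT_TOKEN = "YOUR_BOT_TOKEN"
GUILD_ID = "123456789012345678"

API = f"https://discord.com/api/v10/guilds/{GUILD_ID}/auto-moderation/rules"
HEADERS = {"Authorization": f"Bot {BOT_TOKEN}"}

rule = {
    "name": "Block harmful phrases",
    "event_type": 1,        # MESSAGE_SEND
    "trigger_type": 1,      # KEYWORD
    "trigger_metadata": {"keyword_filter": ["badword1", "badword2"]},
    "actions": [
        {"type": 1},                                         # BLOCK_MESSAGE
        {"type": 3, "metadata": {"duration_seconds": 600}},  # TIMEOUT for 10 minutes
    ],
    "enabled": True,
}

resp = requests.post(API, headers=HEADERS, json=rule)
resp.raise_for_status()
print("Created AutoMod rule:", resp.json()["id"])
```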
As a one-stop shop for content moderation filters, the new AutoMod lives under the Content Moderation tab in Server Settings for Community-enabled servers.
If you have chosen to have AutoMod send you alerts about flagged messages, you can have it post those alerts to a text channel of your choice.
There, your moderation team can view a flagged message to determine the best course of action, whether it's removing the message itself, timing out the user who tried to post it, or allowing it to remain.
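In API terms, this alert behavior corresponds to the SEND_ALERT_MESSAGE action on an Auto Moderation rule. A minimal sketch, extending the rule created in the earlier example and assuming placeholder IDs for the rule and the moderation log channel, updates the rule so flagged messages are both blocked and reported to that channel:

```python
import requests

BOT_TOKEN = "YOUR_BOT_TOKEN"              # placeholder
GUILD_ID = "123456789012345678"           # placeholder
RULE_ID = "111111111111111111"            # placeholder: ID returned when the rule was created
ALERT_CHANNEL_ID = "987654321098765432"   # placeholder: your mod-log text channel

url = f"https://discord.com/api/v10/guilds/{GUILD_ID}/auto-moderation/rules/{RULE_ID}"
headers = {"Authorization": f"Bot {BOT_TOKEN}"}

# Replace the rule's actions: keep blocking the message, and also send an
# alert to the moderation channel so the team can review flagged messages.
patch = {
    "actions": [
        {"type": 1},                                                 # BLOCK_MESSAGE
        {"type": 2, "metadata": {"channel_id": ALERT_CHANNEL_ID}},   # SEND_ALERT_MESSAGE
    ]
}

resp = requests.patch(url, headers=headers, json=patch)
resp.raise_for_status()
```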