How do we moderate our games?


Wildlife carries out daily moderation operations through a diverse set of methods and tools. In the interest of transparency, this section outlines each method we use to keep our communities safe for all players.

Automated Moderation

All of our games with chat features use an AI moderator that censors comments found to violate the following policies:

  • Profanity
  • Insults
  • Political Speech
  • Religious Speech

If you believe your comment was censored in error, you can appeal by tapping the appeal button that appears on your screen when the comment is censored. This sends your comment to a human moderator for review, and if we confirm an error on our side, we will flag it with our AI partner so the necessary adjustments can be made to our automations.

Automated Redaction

Comments are also automatically censored under two additional policies:

  • Hate Speech
  • Sexual Content

These two policies are unique: when a comment is censored for Hate Speech or Sexual Content, it is sent to one of our human moderators for review. If the comment is confirmed to violate our code of conduct, appropriate action will be taken against the responsible account. The punishments for these policies are outlined on their respective pages.

User Reports

Although we have taken great measures to ensure that our automated moderation system keeps players as safe as possible from harmful content, we acknowledge that such content can sometimes go undetected. For this reason, our games also feature a user report button that lets you, the player, bring harmful content directly to our attention from within the app.

All reports sent to us are reviewed by a human moderator. If reported content is found to violate our code of conduct, the following actions will be taken:

  • Appropriate action will be taken against the responsible account.
  • The content will be flagged with the team in charge of automated moderation to prevent similar content from going undetected in the future.

Types of Punishments Applied to Accounts

  • Code of conduct warning
  • Temporary chat suspension
  • Temporary game suspension
  • Permanent chat suspension
  • Permanent game account closure