At Wildlife Studios, our goal is to create games that foster safe, fair, and inclusive environments for all of our players. To do this, we use a combination of technology and people to identify and remove violations of our Code of Conduct. We also continually improve our safety policies and products to support the overall safety of our games. Our Fair Play and Safety Policy was designed to foster an experience that prioritises safety, inclusion, and fairness. Our policies apply to everyone and to all content, and we strive to be consistent and equitable in our enforcement.
To provide insight into these efforts and visibility into the nature and volume of content reported in our games, we publish transparency reports twice a year. This report will highlight how we enforce safety and privacy across our platform. We will also go beyond simply reporting our data; here we provide additional context and insight into our safety principles, policies, and practices, as well as links to various safety and privacy resources.
This report covers October 2021 to March 2022. It shares global data on the volume of in-game chat content that was flagged for some form of inappropriate content, and on our enforcement levels.
Overview of Content and Account Violations
From October 1st, 2021, to March 1st, 2022, we proactively enforced against 54,719 pieces of content globally that violated our in-game chat policies.
Enforcement actions include removing the offending content or terminating the account in question.
During the reporting period, we saw a Violative View Rate (VVR) of 0.028%, which means that roughly one out of every 3,500 messages was found to violate our Fair Play and Safety Policy.
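As a quick arithmetic check, the stated 0.028% rate can be converted into a "one in N messages" figure:

```python
# Converting the stated Violative View Rate (VVR) of 0.028% into a
# "one in N messages" figure.
vvr = 0.028 / 100        # 0.028% expressed as a fraction
one_in_n = 1 / vvr       # messages per violative message
print(round(one_in_n))   # → 3571, i.e. roughly one in every 3,500 messages
```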
As part of our goal to create fair, safe, and inclusive environments, we take a proactive approach to shield our players from harmful content. With this in mind, we automatically redact any message that our AI flags as containing profanity, sexual content, or hate speech. In the reporting period, we redacted 260,835 pieces of content in real time.
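A minimal sketch of how real-time redaction of this kind can work. The classifier, labels, and threshold below are hypothetical illustrations, not Wildlife's actual system:

```python
# Hypothetical sketch: redact a message when a moderation classifier
# flags it for any blocked category. classify() is a toy stand-in for
# a real AI moderation model.
BLOCKED = {"profanity", "sexual_content", "hate_speech"}
THRESHOLD = 0.8  # assumed confidence cutoff for automatic redaction

def classify(message: str) -> dict:
    """Stand-in for an ML moderation model: returns per-label scores."""
    # Toy keyword heuristic, for illustration only.
    return {"profanity": 0.95 if "badword" in message else 0.0}

def moderate(message: str) -> str:
    """Return the message unchanged, or a redaction marker if flagged."""
    scores = classify(message)
    if any(scores.get(label, 0.0) >= THRESHOLD for label in BLOCKED):
        return "***"  # redact the whole message in real time
    return message

print(moderate("hello team"))     # → hello team
print(moderate("badword alert"))  # → ***
```

In practice the per-label thresholds, model, and redaction behaviour would be tuned per policy area; the sketch only shows the flag-then-redact flow described above.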
| Metric | Count |
| --- | --- |
| Total in-game messages scanned by our AI-moderation tool | 33,314,747 |
| Total cases reviewed by our Trust & Safety team | 100,697 |

| Reason | In-game message content enforced |
| --- | --- |
| Threats & Violence | 77 |
This section provides more details about some of the policy areas we take action on.
Combating Child Sexual Exploitation
Sexually exploiting a minor is illegal and is prohibited by our Fair Play and Safety Policy. Preventing, detecting, reporting, and eradicating Child Sexual Exploitation Material is a top priority for us. We are continuously evolving our policies and technology to be able to address child exploitation material in the most effective way possible.
We report all incidents of Child Sexual Exploitation, whether image-based or text-based, to the National Center for Missing and Exploited Children (NCMEC), which, in turn, coordinates with domestic or international law enforcement as required.
We use very sensitive filters to ensure that if any content has even a minor possibility of containing Child Sexual Exploitation, it will be flagged and surfaced for manual review.
As a result of these filters, 17,745 cases were flagged and manually reviewed.
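The "very sensitive filters" described above amount to using a deliberately low confidence threshold, so that even borderline content is routed to human review rather than missed. A hypothetical sketch (the threshold value and function names are illustrative assumptions):

```python
# Hypothetical sketch: a deliberately low threshold routes even
# low-confidence matches into a manual review queue.
REVIEW_THRESHOLD = 0.05  # assumed: far lower than any auto-action cutoff

def needs_manual_review(risk_score: float) -> bool:
    """Flag content for human review if there is even a minor
    possibility of a violation (risk_score is a model's 0-1 confidence)."""
    return risk_score >= REVIEW_THRESHOLD

print(needs_manual_review(0.07))  # → True: surfaced for manual review
print(needs_manual_review(0.01))  # → False
```

The trade-off is intentional: a low threshold produces many false positives for reviewers, but minimises the chance that genuine exploitation material slips through.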
In 2021 (12 months), we made 84 reports to NCMEC.
Self-Harm & Suicide Content
We care deeply about the mental health and well-being of our players, which has influenced many of the decisions we have made about how to best build Wildlife support systems. When our Trust and Safety team recognises a player who may be in distress, we forward self-harm resources so that the player can reach out to a professional for help. We have also made a list of resources globally available in our well-being and resource center: https://wildlifestudios.com/safety-center/
We sent 1,060 messages containing helpline resources to players during the reporting period.
Additional Safety Resources
Wildlife Studios are deeply committed to the safety and well-being of our players. Here are some of the additional resources that further explain and inform our policies and actions.
Our safety center was created as a space to inform and engage all of our players on how to keep our games safe. It houses numerous resources, such as our Fair Play and Safety Policy and Information for Guardians, aimed at equipping players and guardians with what they need to maintain safe gameplay.
Our Fair Play and Safety Policy is the foundation of our safety principles and is the basis on which we form our internal policies and operating procedures. We continuously review our policies to ensure that they are updated in accordance with trends and current events.
Whilst our Trust and Safety teams and our advanced AI operate 24/7 to help keep our games safe, we also rely on our players to report incidents of harmful content via our in-game reporting system. During the reporting period, our team handled 48,692 user reports.
The well-being and safety of our players is a top priority for us. To make sure that all of our players and their friends and family members have access to resources, we have developed proactive in-game tools such as "here for you", which offers support to those who may be experiencing mental health issues. We have also developed our well-being resource center, which houses global helplines for a multitude of services, along with advice for players who may have a friend or family member in distress.
Situations may arise where law enforcement or other authorities seek access to information held by Wildlife Studios. We intend to comply with our legal obligations in each of the jurisdictions we operate in; however, we also believe that personal data should only be provided when we reasonably believe we are legally required to do so.
Government representatives and public authorities can contact Wildlife Studios to learn more about how we handle law enforcement or other requests. We have a team dedicated to handling these requests.
- 53% of requests were sent from authorized law enforcement representatives (e.g., police officer, federal agent) and 47% from government representatives (e.g., district attorney, minister).
- The most common requests were classified as Subpoenas, with 38% overall. The least common requests were classified as Court orders, with 14%. Also, 29% of requests refer to Preservation requests and 19% to Search warrants.
- The vast majority of rejections to requests for information result from incomplete or partial data (80%). Others are rejected for being considered frivolous (10%) or unrelated to law enforcement (10%).
We strive to provide clear and accessible information to our players on how we use their data, and why. We are also making continuous efforts to provide efficient controls for our players and to ensure they have easy and quick support when it comes to their personal data.
Players can contact Wildlife to learn more about how we handle their data or make privacy requests. We have a team working from Legal to Trust & Safety and Engineering dedicated to managing these requests.
- Most privacy requests refer to players requesting a copy of their data - 25%, followed by requests for data deletion - 19%.
- The least common requests refer to players objecting to their data being processed and requests for a review on automated decision making - both 2%.
- 45% of players choose not to disclose where they are located. From those that do, most requests come from the United States - 9% overall. India and Russia complete the top 3 with 7% and 6%, respectively.
- The vast majority of rejections happen because requests are not related to privacy - 98% of all rejections. These requests are nonetheless routed to player support teams, and this pattern led to a revamp of our UI and UX in early 2022.
Our games are not meant for kids and, except for specific circumstances where we obtain parental consent, we are not aware of children playing our games.
If we do learn that a child is playing, our team promptly deletes their personal information and closes the player account. From October 2021 to March 2022, we carried out that procedure 70 times.
We applaud Wildlife Studios' commitment to user safety and transparency and are proud to partner with them using our AI content moderation technologies to help them achieve their goals of building more inclusive and resilient communities globally.
CEO and Co-Founder, Spectrum Labs