Social, live-streaming, and dating apps have long drawn policy scrutiny on platforms like Google Play because of their user-generated content (UGC) models and high-frequency user interactions. Starting in early December, social and dating apps will begin receiving notifications from Google Play about the "Children’s Safety Standards" policy declaration form. This means that every "social" or "dating" app must provide published safety standards and contact information to comply with Google Play's "Children’s Safety Standards" policy.
How should developers correctly fill out the "Children’s Safety Standards" policy declaration form?
- Prepare a "Safety Standards" URL and include a CSAE statement in the Terms of Service: Provide Google with a valid link demonstrating compliance with "Child Sexual Abuse and Exploitation" (CSAE) prevention. The developer must include a statement about complying with CSAE standards in the Privacy Policy or Terms of Service and ensure that the link is always accessible. A sample reference for CSAE prevention standards:
  - Our Commitment: We are dedicated to protecting children’s safety, preventing any form of sexual abuse, exploitation, and related harmful behavior. Our app complies with all applicable laws and regulations and the "Children’s Safety Standards" policy, always prioritizing children's safety and rights.
  - Scope: This standard applies to all users and related content, including but not limited to user-generated content, interaction behavior, reporting mechanisms, and content review. Our goal is to create a safe and inclusive social environment, especially providing extra protection for underage users.
  - Core Measures:
    - Strict Age Limits: Users must confirm their age upon registration, with additional protection for minors, including restricted access to certain features.
    - Content Review: All user-generated content (UGC) undergoes both manual and algorithmic review to prevent content related to sexual abuse, exploitation, or improper behavior from appearing on the platform.
    - User Behavior Standards: Any sexualized behavior or improper actions involving children are prohibited. Such behavior will be handled immediately and reported to the relevant authorities.
    - Reporting and Response Mechanism: We provide user-friendly reporting tools that allow users to quickly report suspicious content or behavior. Our dedicated team addresses issues within 24 hours of receiving a report.
    - Cooperation and Reporting: We cooperate with child protection organizations and law enforcement agencies to ensure timely investigation and reporting of any suspected CSAE case.
  - Responsibility and Transparency: We regularly update and publicly disclose our child protection policies and measures, and provide users with transparency reports on our implementation and improvement plans.
  - Contact Us: If you find any content or behavior related to child sexual abuse or exploitation, please contact us by email (XXXXX@gmail.com) or use the "Feedback" and "Report" features within the app.

  Developers should add these child safety-related policy contents to the app's user agreement.
- New "Children’s Content" Reporting Feature in the App: Developers must provide a "Children’s Safety" report type within the app. Developers can choose any suitable method, as long as users can use this function without leaving the app. Options include a comprehensive in-app feedback tool, an email contact, or a chat channel for receiving reports. Developers must also declare in the Play Console that they have provided the corresponding in-app feedback mechanism (a rough sketch of one possible entry point follows below).
“Children’s Safety Standards” Policy Timeline
Google Play plans to implement the "Children’s Safety Standards" policy according to the following schedule. Please note that this timeline may change, and updates will be posted in this article.
- April 2024: Google announces the new "Children’s Safety Standards" policy.
- December 2024: All social and dating apps begin receiving notifications of the "Children’s Safety Standards" policy declaration form.
- January 22, 2025: All applicable apps (social or dating) must fully comply with the policy. After this date, non-compliant apps will be removed from Google Play.
According to the policy timeline released by Google, social apps that are already online have roughly a two-month buffer, which should give developers enough time to adjust and optimize.
Social Apps Face Another Wave of Removal! How Should Developers Respond?
On December 6, major social live-streaming apps, including Bigo Live, Tango, and LiveMe, were simultaneously removed from both the Apple App Store and Google Play. Nearly 20 days have passed since this "earthquake" in the social app industry. As of this writing, Tango and Bigo Live have been reinstated on Google Play, while the other apps have seen no change. It is worth noting that Tango was reinstated just 7 days after removal, and the reinstated build is the same version that was removed on December 6 (8.80.1732719153). Bigo Live, after two weeks of adjustment, returned on the afternoon of December 20, with its previous version (6.22.4) updated to 6.23.2.
Did Google merely suspend Tango while fully removing Bigo Live and the other products? That is one possibility, but we still do not know how Tango managed to complete its rectification and be reinstated in just a week.
With sensitive categories like social apps facing frequent removals, how should developers respond to keep their apps running stably?
- External Standards: Include CSAE-related requirements in the terms of service, usage policies, and community guidelines, clearly prohibiting related behaviors.
- Internal Safety Guidelines: Companies should create internal child safety guidelines, along with execution processes, to help teams identify and handle CSAE issues.
- Smooth Reporting Channels: Set up a reporting mechanism that complies with local laws, allowing users to easily report potential child abuse, and in particular provide reporting options within the platform itself.
- CSAE Detection: Use technology (such as hash matching, image classifiers, and keyword extraction) to detect CSAE content, including images, videos, and text on the platform; a minimal sketch of the hash-matching idea follows this list.
- Enforce Penalties: Take appropriate action against policy violations, such as removing inappropriate content, banning accounts, and blocking devices.
- Cooperate with Law Enforcement: Within the scope of the law, cooperate with law enforcement agencies and provide the user data and case materials needed for investigations.
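To make the CSAE Detection point above more concrete, here is a minimal Kotlin sketch of the hash-matching and keyword-screening idea. It is only an illustration under simplifying assumptions: the known-hash set, keyword list, and the `ModerationResult`/`CsaeScreener` types are hypothetical, and real deployments typically use perceptual hashes obtained through industry hash-sharing programs and trained classifiers rather than exact SHA-256 matching.

```kotlin
import java.security.MessageDigest

// Hypothetical sketch of simple CSAE pre-screening. Exact SHA-256 matching
// only catches byte-identical files; production systems use perceptual
// hashes and ML classifiers in addition to human review.

data class ModerationResult(val flagged: Boolean, val reason: String?)

class CsaeScreener(
    private val knownBadHashes: Set<String>,   // hex-encoded SHA-256 digests
    private val flaggedKeywords: List<String>  // terms that trigger human review
) {
    // Hash the raw image bytes and check them against the known-hash set.
    fun screenImage(imageBytes: ByteArray): ModerationResult {
        val digest = MessageDigest.getInstance("SHA-256").digest(imageBytes)
        val hex = digest.joinToString("") { "%02x".format(it) }
        return if (hex in knownBadHashes) {
            ModerationResult(flagged = true, reason = "hash match: $hex")
        } else {
            ModerationResult(flagged = false, reason = null)
        }
    }

    // Flag text containing any configured keyword for manual review.
    fun screenText(text: String): ModerationResult {
        val hit = flaggedKeywords.firstOrNull { text.contains(it, ignoreCase = true) }
        return if (hit != null) {
            ModerationResult(flagged = true, reason = "keyword: $hit")
        } else {
            ModerationResult(flagged = false, reason = null)
        }
    }
}
```

Anything flagged this way would be routed to a human review queue and, where local law requires, reported to the relevant authorities.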
Conclusion: The new "Children’s Safety Standards" policy comes into full effect on January 22, 2025. To avoid disruptions, developers should start preparing early, keep up with Google's policy changes, and make the necessary adjustments.
Finally, creating high-quality content is not easy, so feel free to follow my public account. If you have more insights or questions about this Google Play policy update, share them in the comment section. You can also add me on WeChat (kris_wuii) to join my GP Overseas group, where we exchange ideas and learn together.