Summary
A report by Global Witness found that TikTok's algorithm was suggesting pornographic and sexualized content to fake child accounts, even when safety settings were activated. Despite TikTok's stated efforts to provide age-appropriate experiences, the report shows that inappropriate content continued to be recommended to young users.
Key Facts
- Researchers from Global Witness created four fake TikTok accounts pretending to be 13-year-olds.
- The accounts received pornographic content recommendations despite using TikTok's "restricted mode," which is meant to block such material.
- Suggested search terms led to explicit videos, including sexual acts and provocative images.
- The explicit videos appeared under "you may like" recommendations, evading TikTok's content moderation.
- Global Witness informed TikTok, which said it took immediate action to address the problem.
- TikTok states it has over 50 features to ensure teen safety and removes most violative content before it's seen.
- The UK Online Safety Act's Children's Codes came into force on July 25, requiring platforms to protect children from harmful content.
- Global Witness found pornographic content being suggested again in late July and August, after the Children's Codes had come into force.