Summary
Regulators in the UK have asked major social media companies to strengthen their age checks to prevent children under 13 from using their platforms. The companies involved include Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X. These platforms have been criticized for relying on self-reported ages, an approach that allows underage users to gain access.
Key Facts
- UK regulators Ofcom and the Information Commissioner's Office are involved in the request.
- Social media companies contacted include Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X.
- Current age checks allow children under 13 to access these platforms simply by lying about their age.
- Ofcom says these companies are not focusing enough on children's safety.
- The UK wants firms to adopt stronger age checks, similar to those used for adult content online.
- The Information Commissioner's Office emphasized that platforms have no legal basis to handle data of users under 13 without proper safeguards.
- Companies including Meta, Snapchat, and TikTok have begun rolling out or testing new age verification tools.
- YouTube and other companies argue they already have adequate systems in place and call for regulators to focus on higher-risk services.