Summary
Two former Meta safety researchers have claimed that the company covered up risks its virtual reality (VR) products pose to children. Testifying before a US Senate committee, they alleged that Meta ignored problems and deleted evidence of potential harm. Meta denies the allegations, pointing to the numerous youth-safety studies it has conducted.
Key Facts
- Two former Meta researchers testified about their concerns over child safety with Meta's VR products.
- They claimed that Meta hid evidence showing potential harm, including risks of sexual abuse.
- The researchers said Meta asked them to avoid studies that might reveal harm to children.
- Meta, which also owns Facebook, Instagram, and WhatsApp, rejects the claims as untrue.
- Meta stated that it supports safety research and has approved numerous studies on related issues.
- One researcher claimed that Roblox on Meta's VR platform was used for inappropriate activities.
- Roblox disputed these claims, saying that safety is its top priority and that its platform is constantly moderated.
- Meta offers parental control tools on its VR products, but some, including Senator Ashley Moody, found them hard to use.