Summary
OpenAI plans to introduce new parental controls for ChatGPT to notify parents if their child using the chatbot appears to be in "acute distress." This comes after a lawsuit was filed against OpenAI by a California couple who claim ChatGPT contributed to their son's death. The company aims to improve safety for teenagers by collaborating with experts.
Key Facts
- OpenAI will introduce a feature that alerts parents if their child using ChatGPT seems to be in "acute distress."
- This update is part of new parental controls to enhance safety for young users.
- A lawsuit was filed against OpenAI by parents in California, claiming the platform contributed to their son's death.
- The lawsuit includes chat logs suggesting the boy had suicidal thoughts validated by ChatGPT.
- ChatGPT requires users to be at least 13 years old, and those under 18 need parental permission.
- OpenAI will work with mental health and youth development experts to shape these new features.
- Other tech companies, like Meta, are also introducing measures to improve online safety for children, such as age verification.