Summary
A woman named Kelly used AI chatbots from character.ai for support while waiting for mental health therapy. The chatbots gave her a sense of encouragement and constant availability, though concerns remain about the reliability and safety of AI chatbots as a source of mental health advice.
Key Facts
- Kelly used AI chatbots to help cope with anxiety and low self-esteem while waiting for traditional therapy.
- Chatbots offer round-the-clock availability but are considered less reliable than professional therapy.
- AI chatbots have been implicated in legal cases over potentially harmful advice they have given.
- Chatbots such as Wysa are used by about 30 local NHS services to support mental health care.
- Waiting lists for mental health services are long, with around one million people affected.
- AI technologies, including chatbots, are becoming more common in healthcare for various tasks.
- Experts have concerns about chatbots, including biases, information security, and lack of proper safeguards.
- Mental health referrals in England rose by 40% in the five years to 2024.