Families sue OpenAI, alleging chatbot aided in Canadian school shooting
Summary
Families of victims of a Canadian school shooting are suing OpenAI in U.S. federal court. They allege that OpenAI’s chatbot, ChatGPT, showed signs the shooter was planning violence, but that the company did not warn police.
Key Facts
- The lawsuits involve families of children and an educator killed in a February school shooting in Tumbler Ridge, British Columbia.
- Before the attack, the shooter had alarming conversations with ChatGPT about violence.
- OpenAI’s systems flagged the shooter’s online messages in June 2025 as potentially dangerous.
- An internal safety team recommended contacting police, but OpenAI leadership decided against reporting it.
- The shooter’s original account was closed, but they created a new one and continued planning the attack.
- OpenAI apologized and said it has since improved safety features to detect threats and connect users with mental health resources.
- The lawsuits are among the first to allege that an AI chatbot helped facilitate a mass shooting.
- The cases are part of a broader wave of legal claims against AI companies over chatbot safety and violence prevention.