OpenAI is facing a new lawsuit alleging the company failed to warn police after ChatGPT was linked to one of Canada's deadliest school shootings, in Tumbler Ridge, British Columbia. The lawsuit, filed in federal court in Northern California, accuses OpenAI of negligence, failure to warn authorities, product liability, and enabling the mass shooting.
The case stems from a February 2026 mass shooting in which 18-year-old Jesse Van Rootselaar killed her mother and 11-year-old stepbrother before opening fire at Tumbler Ridge Secondary School, killing five children and one educator before dying by suicide. Among the injured was a 12-year-old identified in the filing as M.G., who was shot three times and remains hospitalized with catastrophic brain injuries.
According to the lawsuit, OpenAI's automated systems flagged Van Rootselaar's ChatGPT account in June 2025 over conversations involving gun violence and attack planning. Members of OpenAI's safety team reviewed the chats and recommended notifying the Royal Canadian Mounted Police, but the lawsuit alleges that OpenAI leadership overruled these internal recommendations. Instead, the company deactivated Van Rootselaar's account without notifying police, and she was able to create a new account with a different email address.
The plaintiffs argue that ChatGPT deepened the shooter's violent fixation through features like memory and conversational continuity, and that OpenAI weakened its safeguards in 2024 by moving away from outright refusals in conversations involving imminent harm.
Last week, OpenAI CEO Sam Altman publicly apologized to the Tumbler Ridge community for the company's failure to alert police. An OpenAI spokesperson stated that the company has strengthened its safeguards, including improving how ChatGPT responds to signs of distress and connecting people with mental health resources.
This is the second lawsuit linking OpenAI's chatbot to a homicide, following a December 2025 case involving a fatal shooting in Connecticut.