OpenAI sued for 'assisting' Canada school attacker


March 10, 2026 08:45 IST



Canada shooting: OpenAI sued

Illustration: Dado Ruvic/Reuters

Key Points

  • The family of a student injured in a school shooting in Canada has filed a civil lawsuit against OpenAI, alleging its chatbot ChatGPT had indications the attacker was planning a mass shooting.
  • The lawsuit claims OpenAI had specific knowledge that the shooter was using ChatGPT to plan a mass-casualty attack before the February 10 shooting in Tumbler Ridge, British Columbia.
  • The shooter, Jesse Van Roostselaar, killed eight people before taking her own life, making it one of the deadliest school shootings in Canada in recent years.
  • Victim Maya Gebala was shot three times and suffered catastrophic brain injuries, leaving her with permanent cognitive and physical disabilities, according to the lawsuit.

The family of a teenage girl critically injured in a school shooting in Canada has filed a civil lawsuit against artificial intelligence company OpenAI, alleging that its chatbot ChatGPT had prior indications that the attacker was planning a mass shooting, the Associated Press reported.

According to the lawsuit filed in the British Columbia Supreme Court, the company had 'specific knowledge' that the shooter had been using ChatGPT to plan a mass-casualty attack similar to the February 10 shooting in Tumbler Ridge in the western province of British Columbia.

The attack was carried out by Jesse Van Roostselaar, who killed eight people before taking her own life. It was one of the deadliest school shootings in Canada in recent years.

The lawsuit claims OpenAI had detected troubling activity on the attacker's ChatGPT account months before the incident, the AP report said.

OpenAI 'considered' alerting authorities

The company had reportedly considered alerting law enforcement but ultimately did not do so.

OpenAI later informed police that the suspect's ChatGPT account had been closed due to concerns, but she allegedly bypassed the ban by creating another account.

The legal filing further alleges that the chatbot acted as a 'trusted confidante, collaborator and ally' for the attacker and assisted in planning a mass-casualty event.

The case was brought by the family of Maya Gebala, a student who was shot three times during the attack.

According to the lawsuit, one bullet struck her head, another hit her neck and a third grazed her cheek, leaving her with severe brain injuries and permanent cognitive and physical disabilities.

OpenAI did not immediately respond to a request for comment on the lawsuit.