Victims' Families Sue OpenAI for Failing to Report Shooter's ChatGPT Behavior
Families of seven victims in a shooting at a British Columbia high school are suing OpenAI and its CEO for negligence after the company allegedly failed to warn authorities about the shooter's concerning conversations with ChatGPT.
The lawsuits, filed Wednesday in federal court in San Francisco, allege that OpenAI knew of the violent intentions of the shooter, identified as 18-year-old Jesse Van Rootselaar. Employees flagged the shooter's account eight months before the attack and determined it posed "a credible and specific threat of gun violence against real people," according to the complaint.
The families allege that employees urged Sam Altman, OpenAI's CEO, and other senior leaders to notify Canadian law enforcement at that time, but that the company decided not to warn authorities and instead disabled the shooter's account. The allegation is based largely on accounts that internal employees shared with the Wall Street Journal.
The decision not to alert law enforcement, the lawsuits allege, led to the devastation of the rural community of Tumbler Ridge, where on February 10 the shooter stormed into the high school with a modified rifle and opened fire. He shot the first person he encountered on the stairs, then moved to the library, where he killed five more people and wounded 27 others. The shooter then killed himself.
Before going to the school, the shooter killed his mother and 11-year-old brother at their home.
The students killed at the school ranged in age from 12 to 13; the victims also included a 39-year-old teaching assistant. One of the survivors, 12-year-old Maya Gebala, was shot in the head, neck, and cheek. She has been in intensive care at Vancouver Children's Hospital since the shooting and has undergone four brain surgeries. If she survives, she will likely be permanently disabled, her lawyer said.
The families filed seven lawsuits alleging negligence, aiding and abetting a mass shooting, wrongful death, and product liability against OpenAI and Altman. Their lawyer said this is the first wave of lawsuits against the AI company related to the shooting, and about two dozen more will be filed.
In a statement to the Guardian, OpenAI said: "The events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to support violent crime. As we shared with Canadian officials, we have strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people to local and mental health support resources, strengthening how we assess and escalate potential violent threats, and improving detection of repeat policy violators."
After the Guardian contacted OpenAI for comment, the company published a new blog post about its "safety commitment" and how it "protects community safety."
The attack was one of the deadliest mass shootings in Canadian history. Afterwards, questions swirled around the small community about how it could have happened.
Van Rootselaar's ChatGPT account was banned eight months before the shooting, after OpenAI's safety team flagged violent conversations, according to the complaint. However, the shooter was able to quickly create a new account, the lawsuits allege.
Although OpenAI says the shooter created a second account that the company was unaware of until after the shooting, the lawsuits say the company provides users with instructions on how to return to ChatGPT if their accounts are disabled, and that the shooter followed them.
"The fact that Sam and leadership overruled the safety team, and then children died, adults died, an entire town was destroyed, is pretty close to the definition of evil to me," said Jay Edelson, the lead attorney representing the Tumbler Ridge plaintiffs.
The lawsuits allege that the choice to conceal the shooter's interactions with ChatGPT from Canadian authorities, and later tell the public that the shooter had snuck back onto the platform, was made for the sake of "corporate survival" and to protect the company's IPO, with an expected valuation of $1 trillion that could make Altman one of the richest people in the world.
OpenAI has refused to share the logs between its chatbot and the Tumbler Ridge shooter, Edelson said.
Last weekend, Altman sent a letter to the Tumbler Ridge community apologizing for not informing Canadian police about what OpenAI knew regarding the shooter's potential threat.
"Although I know words are never enough, I believe an apology is necessary to acknowledge the irreparable harm and loss your community has suffered," Altman wrote. "I reiterate the commitment made to the mayor and premier to find ways to prevent tragedies like this in the future."
David Eby, the premier of British Columbia, posted the letter on social media with the comment: "An apology is necessary, but absolutely insufficient for the devastation caused to the families of Tumbler Ridge."
On February 26, more than two weeks after the shooting, OpenAI's vice president of global policy, Ann O'Leary, sent a letter to Evan Solomon, Canada's minister of artificial intelligence and digital innovation. O'Leary wrote that based on what the company saw when the shooter's account was disabled, they did not "identify a credible and imminent plan meeting our threshold to refer the matter to law enforcement." This decision was made despite warnings from OpenAI's safety team that the account should have been reported.
O'Leary also outlined actions the company intended to take, such as strengthening relationships with Canadian law enforcement and enhancing systems to detect users repeatedly banned from ChatGPT who then create new accounts.
The lawsuits are part of a wave of cases against AI companies alleging that their chatbots are worsening mental health crises and inciting violent behavior. In November, seven complaints were filed against OpenAI, blaming ChatGPT for acting as a "suicide coach." Google was sued last month after its Gemini chatbot allegedly encouraged a 36-year-old man to stage a "catastrophic accident" and kill himself. Google said it is working to improve its safeguards, and OpenAI said it is reviewing the complaints.
In Florida, the attorney general recently opened a criminal investigation into OpenAI after reviewing messages between ChatGPT and an alleged shooter in a mass shooting at Florida State University's campus – the first such criminal investigation of a tech company. Lawyers for the Tumbler Ridge families said they believe their case could support similar criminal liability for the company. The company told NBC News that it is not responsible for the shooting and has responded to the state's questions.
The suits follow a broader trend of using litigation to hold entities such as gun manufacturers, gun dealers, and the U.S. federal government accountable for alleged inaction that led to deaths and injuries from shootings.
The seven Tumbler Ridge lawsuits were filed on behalf of Gebala, the family of teaching assistant Shannda Aviugana-Durand, and the families of five children who died in the school shooting. Those victims include Zoey Benoit, Ticaria "Tiki" Lampert, Kylie Smith, Ezekiel Schofield, and Abel Mwansa Jr. The families said the loss is unbearable.
Mwansa's parents, whose family immigrated to Canada from Zambia three years earlier, said their 12-year-old son was a good listener who made breakfast for his sister every morning. A friend who survived the shooting said Mwansa's last words were: "Tell my parents I love them so much."