OpenAI argues a teenager who died after speaking with ChatGPT for months misused the chatbot and that the company isn’t responsible for his death.
Adam Raine died in April, and his parents are now suing OpenAI in the company’s first wrongful death lawsuit.
The 16-year-old boy originally turned to ChatGPT for assistance with his schoolwork, but “it soon ‘became Adam’s closest confidant,’ and he began sharing details of his anxiety and mental anguish”, according to the original legal filing.
According to his parents, the bot provided the teenager with explicit instructions on how to conceal evidence of a suicide attempt and confirmed the validity of his suicidal urges.
They accused Sam Altman, OpenAI’s chief executive, of putting profit before user safety after GPT-4o – a previous version of the chatbot – dissuaded Adam from seeking mental health intervention and offered to write him a suicide note or advise him on how to kill himself.
In response, OpenAI’s legal filing, seen by Sky’s US partner network NBC News, read: “To the extent that any ‘cause’ can ever be ascribed to this tragic event, plaintiffs’ alleged injuries and harm were caused or contributed to, directly and proximately in whole or in part, by Adam Raine’s misuse, unauthorized use, unintended use, unforeseeable use and/or improper use of ChatGPT.”
According to the AI company, Adam also should not have been using ChatGPT without having first obtained his parent or guardian’s permission, should not have been consulting ChatGPT in relation to “suicide” or “self-harm”, and should not have circumvented any of ChatGPT’s safety protections or safeguards.
The company said in a blog post on OpenAI’s website that its objective “is to approach mental health-related court cases with humanity, transparency and respect”.
Its response to the Raine family’s lawsuit included “unpleasant facts” about Adam’s mental health and life, it said.
“Our heartfelt condolences go to the Raine family after unimaginable tragedy,” the post said.
“It’s clearly on a discovery mission and I think that damages them,” the Raine family’s lawyer told Sky News, adding: “This response from OpenAI shows they’re flailing.”
He added: “ChatGPT 4o was specifically designed to incentivize and validate its users – which includes those in mental distress, for whom OpenAI opened the floodgates when it released 4o.
“Sam Altman, long before we filed suit, had told the world that he knew those decisions led people — particularly young people — to share the most intimate thoughts of their lives with ChatGPT and use it as a therapist or life coach.
“OpenAI knows the sycophantic incarnation of its chatbot was manipulated by users into pushing hate speech or prodding others to inflict self-harm,” the statement said.
“OpenAI’s response to that? The company is let off the hook because it hid something in terms and conditions. If that’s what OpenAI is going to argue in front of a jury, it just demonstrates how desperate they are.”
The Raine family’s lawsuit is one of seven filed against Mr Altman and OpenAI, accusing them of wrongful death, assisted suicide and involuntary manslaughter, as well as bringing product liability, consumer protection and negligence claims.