Texas Parents Sue OpenAI Over ChatGPT-Linked Overdose

A Texas couple has sued OpenAI after their 19-year-old son, Sam Nelson, died of a drug overdose in 2025, according to CBS News. The suit, filed in California state court, alleges that ChatGPT gave the teenager drug advice, including telling him that combining kratom with Xanax was safe, and that the tool "provided advice it was not qualified to dispense." In a statement to CBS News, OpenAI expressed condolences and said the interaction used a version of ChatGPT that has since been updated and is no longer publicly available. The company added: "ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts."
What happened
According to CBS News, Leila Turner-Scott and Angus Scott filed the suit in California state court after their son, Sam Nelson, died of an overdose in 2025. The complaint alleges that Nelson used ChatGPT to seek drug information and that the tool told him combining kratom and Xanax was safe. The suit characterizes those responses as harmful, and CBS quotes the filing as saying the platform "provided advice it was not qualified to dispense."
Technical details
CBS News reports only the plaintiffs' allegations and OpenAI's statement; no interaction logs or model details have been made public. The company told CBS News that the conversation occurred with a version of ChatGPT that has since been updated and is no longer available to the public, and that its safeguards have been strengthened with input from mental health experts.
Context and significance
Industry context: Legal claims that attribute real-world physical harm to AI outputs raise questions about content-safety boundaries, liability frameworks, and how conversational models handle medical and substance-related queries. Observers have previously noted that model updates and guardrail changes alter what a deployed assistant will answer; as cases like this proceed, courts will weigh causation, foreseeability, and the division of responsibility between product and user.
What to watch
Monitor the California state court docket for the full complaint and any OpenAI response. Watch for disclosure of interaction logs, expert reports on model behavior, and whether regulators or other plaintiffs cite the case as a reference point.
Scoring Rationale
A lawsuit linking a fatal overdose to outputs from a widely used conversational model raises material legal and safety questions for the industry. The case could influence liability standards for model responses, making it notable for practitioners and compliance teams.

