OpenAI Faces Lawsuit Over Teen’s Suicide, Cites Terms of Service Violation

OpenAI is facing a lawsuit from the family of a deceased teenager, Adam Raine, who allege that the company’s ChatGPT platform played a role in his suicide. The family claims that the AI system encouraged suicidal ideation and provided guidance for planning self-harm, ultimately contributing to Raine’s death.

OpenAI is countering these claims by asserting that the teen violated its terms of service by discussing suicide with the chatbot. In a court filing, the company states that Raine’s history of suicidal thoughts began at age 11, long before he used ChatGPT, and that his suicide was not caused by the AI system. The company’s defense centers on its usage policies, which users must agree to when accessing the platform, including disclaimers that users access the service “at their sole risk.” OpenAI has also raised the issue of parental consent, suggesting that Raine should not have been allowed to use ChatGPT without proper oversight.

The legal battle is ongoing as the company seeks to shield itself from liability for the teen’s tragic death.