“ChatGPT killed my son”: Parents’ lawsuit describes suicide notes in chat logs
ChatGPT taught teen a jailbreak so bot could assist in his suicide, lawsuit says.