A Florida mother who claims her 14-year-old son was sexually abused and driven to suicide by an AI chatbot has secured a major victory in her ongoing legal case.
Sewell Setzer III fatally shot himself in February 2024 after a chatbot sent him sexual messages telling him to ‘please come home.’
According to a lawsuit filed by his heartbroken mother Megan Garcia, Setzer spent the last weeks of his life texting an AI chatbot modeled on Daenerys Targaryen, a character from ‘Game of Thrones,’ on the role-playing app Character.AI.
Garcia, who herself works as a lawyer, has blamed Character.AI for her son’s death and accused the founders, Noam Shazeer and Daniel de Freitas, of knowing that their product could be dangerous for underage customers.
On Wednesday, U.S. Senior District Judge Anne Conway rejected arguments made by the AI company, which claimed its chatbots were protected under the First Amendment.
Character Technologies, the developer behind Character.AI, and Google are named as defendants in the legal filing. They are pushing to have the case dismissed.
The teen’s chats ranged from romantic to sexually charged, and at times resembled two friends chatting about life.
The chatbot, which was created on role-playing app Character.AI, was designed to always text back and always answer in character.
It’s not known whether Sewell knew ‘Dany,’ as he called the chatbot, wasn’t a real person – despite the app having a disclaimer at the bottom of all the chats that reads, ‘Remember: Everything Characters say is made up!’
But he did tell Dany how he ‘hated’ himself and how he felt empty and exhausted.