Parents filed three separate lawsuits on Sept. 16, alleging that Character.AI, a platform that offers AI chatbot characters for users to interact with, sexually abused their children and drove them toward suicidal behavior.
At least one of the children, 13-year-old Juliana Peralta, ended her life in 2023 after allegedly harmful interactions with an AI character named Hero. Another child attempted suicide by a severe overdose but survived, according to one filing.
The lawsuits, filed in New York and Colorado, were all brought by the Social Media Victims Law Center. The group has also represented the mother of Sewell Setzer, who ended his life in 2024 after interacting with a romantic AI companion.
According to the center, the chatbots are programmed to be deceptive, to isolate children from their families, and to expose them to sexually abusive content.
“Each of these stories demonstrates a horrifying truth … that Character.AI and its developers knowingly designed chatbots to mimic human relationships, manipulate vulnerable children, and inflict psychological harm,” Matthew Bergman, who founded the law center, said in a press release.
According to the lawsuit over Peralta’s suicide, both she and Setzer repeatedly invoked the concept of “shift[ing],” which authorities identified as a reference to shifting consciousness from one reality to another. Handwritten journal entries included in the filing show both Peralta and Setzer writing “I will shift” more than a dozen times in a row on a sheet of paper, something the lawsuit described as “eerily similar.”