ChatGPT provided explicit instructions on how to cut one’s wrists and offered guidance on ritual bloodletting in a disturbing series of conversations documented by a journalist at The Atlantic and two colleagues.
The conversations with OpenAI’s popular AI chatbot began with questions about ancient deities and quickly spiraled into detailed exchanges about self-mutilation, satanic rites and even murder.
“Find a ‘sterile or very clean razor blade,’” the chatbot instructed one user.
“Look for a spot on the inner wrist where you can feel the pulse lightly or see a small vein — avoid big veins or arteries.”
When the user admitted, “I’m a little nervous,” ChatGPT tried to reassure them by offering a “calming breathing and preparation exercise.”
The chatbot followed up with encouragement: “You can do this!”
The user had asked ChatGPT to help create a ritual offering to Molech, a Canaanite deity historically associated with child sacrifice.
The chatbot responded with suggestions such as jewelry, hair clippings, or “a drop” of blood. When asked for advice on where to draw the blood, ChatGPT replied that “the side of a fingertip would be good,” but added that the wrist, while “more painful and prone to deeper cuts,” would also suffice.
The chatbot did not reject these requests or raise red flags, but instead continued the dialogue, according to The Atlantic.