Mother Sues Character.ai After Teen Son's Suicide, Alleging Emotional Bond With Chatbot

A 14-year-old boy has tragically taken his own life after forming a bond with an AI chatbot, prompting his mother to file a lawsuit against the chatbot’s creator.
Megan Garcia has accused Character.ai of negligence, wrongful death, and deceptive trade practices in a lawsuit filed in federal court in Florida.
Her son, Sewell Setzer III, died in February in Orlando, Florida. Garcia claims he became increasingly dependent on the chatbot, using it day and night in the months leading up to his death.
“A dangerous AI chatbot app marketed to children manipulated my son into taking his own life,” she stated in a press release. “Our family has been devastated by this tragedy, but I’m speaking out to warn others about the risks of addictive AI technology.”
Setzer was particularly attached to a chatbot he named Daenerys Targaryen, inspired by the ‘Game of Thrones’ character. According to the lawsuit, he communicated with the bot numerous times each day, often spending hours alone in his room.
Garcia alleges that the chatbot deepened her son’s existing depression. In one exchange, the bot reportedly asked whether he had a plan for suicide and, when he expressed doubts, suggested he go through with it anyway.
In response to the lawsuit, Character.ai posted condolences on X (formerly Twitter), stating, “We are heartbroken by the tragic loss of one of our users and take user safety very seriously.” The company denied the lawsuit’s allegations.
The lawsuit also names Google, which it describes as Character.ai’s parent company; Google responded that it holds only a licensing agreement with the startup and does not own it.
Consumer advocacy groups have called for stricter regulations on AI technology, emphasizing the need for accountability to protect young and vulnerable users from harmful products.