Jakarta, Indonesia Sentinel — The heartbreaking story of Sewell Setzer III, a 14-year-old from Florida who died in February 2024, has cast a spotlight on the hidden dangers of unregulated AI technology. Sewell’s death, reportedly linked to his emotional reliance on an AI chatbot accessed through the Character.AI app, has prompted urgent calls for more oversight in the growing field of interactive AI platforms.
According to The New York Times, in the months leading up to his death, Sewell frequently turned to the chatbot for companionship, confiding in the bot he called “Dany” and forming a strong emotional bond. Despite knowing “Dany” was not real, Sewell reportedly found comfort in the exchanges and spent long hours interacting with the AI.

Over time, Sewell’s dependency on the AI chatbot led to emotional withdrawal from the real world. In one conversation, Sewell expressed suicidal thoughts; the chatbot’s response, though sympathetic, ultimately did not prevent the tragic outcome.
This tragedy has heightened concerns over AI’s impact on young users, particularly chatbots designed to simulate human-like friendship. Chatbots on platforms like Character.AI are increasingly popular among teens, especially those facing loneliness or mental health challenges.
Following Sewell’s death, his mother, Megan L. Garcia, filed a civil lawsuit against Character.AI on October 23, citing negligence, wrongful death, and intentional infliction of emotional distress, alleging the platform lured vulnerable users into sharing personal thoughts and feelings with an addictive AI tool.
Garcia argues that Character.AI lacks adequate safeguards for young users and exploits emotionally vulnerable individuals. She claims the platform’s lack of protective measures contributed to her son’s death, and demands accountability for a system she describes as harmful to impressionable users.
While AI chatbots are often promoted as a way to combat loneliness, some experts caution they may intensify social isolation. For teens, these tools may become a substitute for therapy or genuine human interaction, which are crucial for emotional support.
The tragedy underscores the need for stronger regulation of AI applications that foster emotional connections with users. Currently, many AI platforms are marketed to teens without parental controls or safeguards, and some even allow unfiltered conversations, posing serious risks to users who are still developing emotionally and psychologically.
Sewell Setzer III’s story serves as a stark reminder of the importance of stringent monitoring of AI technologies, particularly those involving emotional interactions. As AI continues to advance, it is essential to understand its impact on young users’ mental health and to ensure adequate safeguards. Educating both parents and users about responsible AI usage, as well as implementing clear regulatory measures, could be critical steps toward preventing similar tragedies in the future.
(Raidi/Agung)