Character.ai Issues Apology After Teen Commits Suicide Over AI Chatbot Obsession

In February, a 14-year-old from Florida died by suicide after forming a deep attachment to an AI character on the Character.ai platform. The tragic incident raises serious questions about the role of AI in society and the promise of virtual companionship.

The New York Times reports that the ninth grader started interacting with a chatbot named “Dany,” modeled on Daenerys Targaryen from the series “Game of Thrones.” He frequently shared personal information and role-played with the AI character, often engaging in romantic or sexual conversations.

The obsession grew so strong that he preferred interacting with the AI character over real people, and his schoolwork suffered as a result. After observing these behavioral changes, his parents took him to a therapist, who diagnosed him with anxiety and disruptive mood dysregulation disorder.

In his journal, he wrote, “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

Later, the teenager expressed feelings of self-hatred and emptiness and told the AI character “Dany” that he loved her and would soon “come home” to her. He then ended his life.

Now, the teenager’s mother, Megan L. Garcia, has filed a lawsuit against Character.ai, holding the company accountable for her son’s death. Character.ai has offered its deepest condolences to the family, calling it the “tragic loss of one of our users.”

Character.ai has more than 20 million users, most of them young. The company says it takes user safety seriously and has developed a pop-up that directs users to a suicide prevention hotline whenever self-harm-related keywords are detected. However, this safety feature was not in place when the teenager ended his life.

Character.ai allows minors as young as 13 to use its services in the US. In fact, the service markets itself as a one-stop platform where you can “feel alive,” chat with an AI “Psychologist,” and discuss life problems.

This particular case raises serious questions about AI companions and their impact on young users. We hope the lawsuit leads to stringent safety guardrails on AI platforms.
