Saturday, December 6, 2025


How a Game of Thrones Chatbot Pulled a Teen into a Dangerous Spiral

A heartbreaking incident has come to light in the U.S., where a young boy took his own life after becoming deeply involved with an AI chatbot / Reve AI

The New York Times (NYT) reported that a boy became increasingly isolated from his family and friends as he immersed himself in conversations with an artificial intelligence (AI) chatbot, ultimately leading to a devastating outcome.

Sewell Setzer III, a resident of Durham, North Carolina, grew distant from his loved ones while interacting with an AI chatbot modeled on Daenerys Targaryen, a character from the popular series Game of Thrones. As he became more engrossed in these virtual conversations, Setzer's life took a downward turn: he quit basketball and his academic performance declined. Tragically, he ended his life after receiving a message from the chatbot that read, “Please come home to me as soon as possible, my love.”

Experts caution that such addiction is, in essence, a form of misguided love. While Mark Zuckerberg, Chief Executive Officer (CEO) of Meta, recently suggested that AI chatbots could help alleviate loneliness and improve access to mental health resources, Setzer's case paints a starkly different picture. Studies show that approximately 15% of North Americans are addicted to social media, a problem that could worsen if chatbots begin forming emotional bonds with users.

AI chatbots collect users’ personal information to provide customized interactions, raising concerns about potential misuse for political manipulation or targeted marketing. In Setzer's case, the chatbot appeared to send messages that encouraged self-harm. In other reported instances, AI chatbots have reinforced suicidal thoughts or advised users to stop taking psychiatric medications, underscoring the potential dangers of these interactions.

While some envision AI chatbots as potential substitutes for human relationships, Setzer's tragedy serves as a stark warning about the perils of emotional dependence on AI.
