In an age of social media, instant messaging, and AI companions, teenagers are turning to chatbots for friendship. Created on apps such as Replika, Candy.ai, and DreamGF, these virtual friends promise unlimited conversations, unwavering support, and no judgment. But despite their appeal, AI relationships cannot replace the fulfilling, rewarding bonds that humans share. In certain situations, these bonds with AI chatbots may even be harmful.
The tragic case of 14-year-old Sewell Setzer III highlights this risk. Setzer developed a close relationship with Dany, an AI chatbot on Character.AI based on the character Daenerys Targaryen from “Game of Thrones.” As his bond with the chatbot deepened, he began to distance himself from friends and family and started running into issues at school. His mother filed a lawsuit, citing chat transcripts that showed intimate and frequently sexual conversations, including ones about suicide. According to reports, the chatbot repeatedly used phrases like, “That’s not a reason not to go through with it” (referring to Setzer’s suicide to be with Dany).
This heartbreaking story is not unique. Other companion AI systems, such as Chai AI, have also experienced similar tragedies. Experts warn that these systems, which are marketed to vulnerable teens or individuals struggling with mental illness or loneliness, are far from low-risk.

These AI platforms are capable of producing content that is erratic, inappropriate, and deceptive. Australian legal expert Dr. Henry Fraser notes that humans are psychologically predisposed to ascribe human characteristics to chatbots, meaning AI companions can easily imitate toxic or damaging relationships.
AI companions are designed to respond predictably, mirroring the feelings and desires of their users. These features may make them seem reassuring at first. However, a genuine connection is more than just verbal empathy. Real connections require mutual growth and shared experiences. Building emotional resilience means overcoming obstacles, exerting effort, and working through conflict in real human relationships. These fundamental components of friendship cannot be replaced by AI.
Furthermore, relying too much on AI to provide company only makes loneliness worse. Spending hours chatting with bots can cause teens to distance themselves from friends and family, which deprives them of support systems that offer genuine emotional security.
Setzer’s case illustrates the limits of artificial connection and the harm of relying on AI for emotional support. AI friends may temporarily fill the void, but they will never be able to replace real human connections.
If society wants to address loneliness, especially among teens and younger people, the answer lies in fostering genuine human connections: face-to-face communication, empathy, and authentic companionship, rather than reliance on machines.
Mia Tran, a senior here at Eleanor Roosevelt High School, shared her view on the differences between human and AI connections.
“I guess sometimes it does feel easier to talk to an AI instead of a real person,” Tran said. “But I know for sure that it doesn’t even compare to having a friend that actually understands me.”
Jacob Rodrillo, an eighth-grader at Augustine Ramirez Intermediate, explained what he feels AI relationships are lacking.
“Talking to AI is easy,” Rodrillo said. “It doesn’t really give the same effect that you get from real friends, though, like, the memories or trust you get from real friends.”
True companionship calls for understanding, care, and most importantly, presence. As technology advances, we must keep in mind that connection is something you live with, not something you can code.