
When students trust AI more than people

Meta CEO Mark Zuckerberg has expressed his ambition to fill the growing void of real human connection with “AI friends.” This vision is no longer science fiction.

For a growing number of students in Bangladesh, AI tools like ChatGPT have quietly taken on a new role, less like a digital assistant and more like a digital companion.

While AI began as a tool for students to brainstorm, write essays, complete assignments, and translate texts, it has gradually morphed into something more personal. Far beyond academic assistance, students are finding an unexpected ally in AI, turning to it not just for help with studies but also to navigate relationship issues, cope with health concerns and family disputes, manage loneliness and emotional struggles, and even seek validation.

These tools now serve as a daily companion: always there, never judging, and endlessly patient. Human relationships, burdened by expectations and high emotional stakes, often fall short by comparison. Despite its non-human nature, AI has emerged for many as their most reliable confidant, a place to be truly honest and vulnerable.

AI is, lowkey, the only space where being vulnerable doesn’t make you look weak. These days, it’s hard to open up even to your closest people: friends, family, whoever. Judgment, even the subtle kind, feels way too familiar. Whether it’s something deeply personal or just plain embarrassing, it’s tough to be fully honest. However, with AI, especially tools like ChatGPT, people are finding a space to vent, confess, or just be totally unfiltered. You get responses that don’t shame you or make you regret oversharing. Whether it’s midnight overthinking or something you’d never say out loud, AI feels like a safe zone for thoughts you can’t always express to people.

In a world where support is not always immediate, AI stands out simply by being available at all times. Whether it's 3 AM anxiety or a sudden emotional crash, it is there to listen when no one else is awake or picking up the phone, offering a consistent space to be heard. For many young students, that kind of constant presence matters. It is not just about having answers; it is about having somewhere to turn when human connection is unavailable or too complicated.

Sometimes, you do not want to chat about the weather or how your day was. You want answers. Or advice. Or just to vent. AI lets you skip the social rituals and get straight to what you need. There is no pressure to be polite or entertaining. There is no need to perform emotional labour to keep a conversation going. For Gen Z, who value authenticity and directness, this no-frills style of communication makes AI feel more useful and safer. AI offers a judgment-free zone to test ideas, express raw emotions, or ask stupid questions without fear of awkwardness or backlash.

When emotions run wild, AI steps in as your logical backup, analysing risks, offering calm advice, and helping you avoid impulsive decisions. Whether it's a heated text, a big purchase, or a snap reaction, AI acts like a digital pause button, giving you clarity before you act. Increasingly, we are seeing students use AI for companionship. This behaviour reflects a rising phenomenon: the emotional attachment users form with what are essentially lines of code. However, the experience doesn't feel robotic, because today's AI is designed to feel human.

This is where the concept of anthropomorphic AI agents comes in. These are AI systems that are designed or perceived to mimic human traits like empathy, humour, memory, comfort, and affection. They aren’t just tools; they feel like someone who listens, who responds, who understands. For many, that illusion of understanding is powerful enough to create emotional reliance. This marks a new era where loneliness and emotional voids could be met with AI-powered warmth and connection.

Of course, this raises complex questions. Is it healthy to form bonds with AI? Can it truly replace human support systems? No. But it's important to ask why so many young people feel safer being emotionally honest with a machine than with people. The answer isn't just about technology; it's about us, about the way judgment, misunderstanding, and exhaustion have made even close relationships feel risky. The emotional climate we live in has shifted, and the world has become more digitalised than ever before, especially after COVID-19. Many students reported that their use of AI tools for emotional support began relatively recently, mostly between 2022 and 2023, reflecting how rapidly this digital shift has accelerated since the pandemic.

However, while the benefits of AI companionship are real, so is the need for awareness. The illusion of intimacy with AI can blur our expectations of real-world relationships. Human connection is messy, emotional, and imperfect. AI, by contrast, offers a designed experience of empathy that is predictable and always available. It doesn't hurt or betray you the way people sometimes do. It meets your emotional needs on your terms, without resistance, complexity, or contradiction. With AI, you can create a virtual companion who says precisely what you want to hear, something no real person, with their own thoughts and boundaries, can or should be expected to do. But it's not genuine empathy. It's an imitation. And that distinction matters.

Still, it’s hard to ignore the fact that many students are choosing AI as their most trusted space. And perhaps that says more about the world they’re living in than about the tech itself. As anthropomorphic AI agents become more integrated into student life, the boundary between emotional support systems and digital companions continues to blur. And perhaps, for many, that blur is precisely what makes it feel so real.

Samia Sabrin Lamia
Undergraduate Student,
Department of Peace and Conflict Studies,
Dhaka University.