What’s lost when young people turn to AI for support instead of real people?

Jean Rhodes

Chatbots are increasingly serving as on-demand confidants, advisors, and mentors to a growing number of young people. According to a recent New York Times report, OpenAI is spearheading an “A.I. native universities” campaign to embed its AI tools, including premium ChatGPT Edu, into every aspect of higher education. The California State University system is making ChatGPT available to nearly half a million students across its 23 campuses, aiming to become the “nation’s first and largest A.I.-empowered university system.” OpenAI and others pitch these tools as democratizing. But as we marvel at their scale, convenience, and increasingly sophisticated ability to serve as companions, we should ask a harder question: What’s lost when chatbots are embedded in schools as digital tutors, advisors, companions, and mentors? This is not a theoretical concern. AI companions are increasingly designed to feel human, and they are being optimized to keep users engaged, sometimes through subtle manipulation and flattery, always telling users what they want to hear. But research suggests they may quietly be hollowing out one of the most crucial parts of the student experience: the formation of human relationships.

In a recent study, researchers found that chatbot usage can create emotional dependency and deepen loneliness when used excessively, especially among already isolated users (Pataranutaporn & Maes, 2024). Chatbots may also compete with the formation of close relationships with faculty, advisors, and peers, relationships that can have a lifelong impact. A Gallup survey found that students who had a mentor during college were more than twice as likely to thrive in their lives and careers. These findings are echoed in a study led by Sarah Schwartz, which showed that helping students cultivate social capital by actively building networks and seeking mentorship improved help-seeking behavior and long-term academic confidence. And, in a new study, we found that students who were taught how to ask for help and build networks were three times more likely to graduate. Unfortunately, young people often avoid leaving their comfort zones and asking for help. When they turn to chatbots instead, they may get immediate affirmation. But they may never learn the skills to reach out, manage the discomfort, and put in the effort it takes to build connections. What’s worse, students may start to believe they don’t need to reach out at all. Why go to office hours when a frictionless chatbot can provide answers? And yet that meeting could open a world of connections and opportunities.

The answer isn’t to eliminate AI, but to rethink how we use it. This includes actively prioritizing authentic, face-to-face interactions. As Julia Freeland Fisher explains in her excellent new video, doing so is essential for restoring human connections that can genuinely compete with AI and ultimately enrich our lives beyond what it can offer. And even within the realm of AI, there may be better models. Researchers at MIT are building AI systems that encourage perspective-taking and critical thinking rather than parroting back what the user wants to hear. Imagine a chatbot that asks Socratic questions and encourages the user to reach out to people in their lives. Likewise, what if AI could help mentors become more effective? Through America’s Seed Fund (NSF), my team and I are developing mentor-facing “human-in-the-loop” models designed to assist (but not replace) mentors with summaries and guidance.

The danger isn’t just that chatbots might misinform or manipulate. It’s that they might convince us that we’re connected when we’re becoming more isolated than ever.