Quick Answers, Lost Connections: The Hidden Cost of AI in Student Networking
García Mathewson, T. (2025, July 16). Students increasingly rely on chatbots, but at what cost? CalMatters. https://calmatters.org/education/higher-education/2025/07/chatbots/
Introduction
As the appeal of AI assistance grows among students, the risk of eroding vital human connections deserves urgent attention. A recent article by Tara García Mathewson (2025) examines the surge in student usage of AI chatbots—such as ChatGPT, Gemini, and Claude—for academic support and brainstorming. It argues that although these tools offer efficiency for time‑pressed learners, they carry the unintended cost of reducing relational engagement with professors, peers, and advisors. Through student anecdotes and expert commentary, the article explores how the convenience of AI may diminish social capital critical for student retention and career advancement.
Findings
The author describes how students increasingly turn to chatbots for tasks like idea generation and overcoming writer’s block rather than engaging with human interlocutors. She recounts student perspectives, including Christian Alba’s admission that he uses ChatGPT as a starting point but worries about over‑reliance. Experts such as Julia Freeland Fisher and Jean Rhodes underline that each chatbot‑mediated interaction is one fewer opportunity to build relational ties. These ties, both strong and weak, are recognized in social capital theory as major contributors to student success, graduation rates, and future opportunities. Additionally, evidence from a joint MIT Media Lab and OpenAI study links frequent chatbot usage with increased feelings of loneliness and decreased social engagement.
Discussion
These insights are situated within theoretical frameworks of social capital and educational retention, highlighting a tension between AI’s utility and its potential to displace meaningful human contact. While chatbots can scaffold initial academic effort, they may inadvertently stunt development of relational skills, emotional resilience, and network building. Experts caution that embedding AI deeply into student support systems without deliberate safeguards may undercut long‑term benefits of belonging and mentorship.
Implications for Mentoring Programs
Mentoring initiatives should proactively position AI tools as complements rather than substitutes for human interaction. Programs might incorporate reflective prompts about AI usage, encouraging mentees to seek live guidance when facing academic or life decisions. Mentors can help mentees develop help‑seeking confidence, relational resilience, and network awareness. Integrating AI literacy into training can reinforce critical evaluation while preserving mentoring as central to student development. By framing technology as a tool rather than a replacement, mentoring programs can protect social capital and ensure that efficiency does not come at the expense of human connection.