Invisible Digital Companions: How Mentors Can Help Youth Navigate AI
By Jean Rhodes
I recently listened to a fascinating episode of the American Psychological Association’s Speaking of Psychology podcast featuring Dr. Ashleigh Golden and Dr. Rachel Wood discussing AI companions and mental health. The conversation left me thinking about how little many parents and mentors may understand about how teens actually use artificial intelligence, and how this knowledge gap creates risks we need to address. While many adults assume teens mostly use AI tools like ChatGPT for homework help or simple fact-finding, the reality these researchers described is far more complex. According to recent data, roughly two-thirds of American teenagers have used AI chatbots, with more than one-quarter using them daily (Pew Research Center, 2025). More striking still, Common Sense Media found that 72 percent of teens have tried AI companions and over half use them regularly. These numbers point to a fundamental mismatch between adult assumptions and youth reality.
The gap matters because teens are not primarily using AI as a glorified search engine. Instead, many are turning to these tools for emotional support, companionship, and relationship advice. The technology operates as what Dr. Wood calls an “omnibot,” shifting fluidly between helping with algebra homework, discussing family conflicts, planning meals, and even engaging in flirtation. One moment the chatbot assists with an essay; the next it validates feelings about peer rejection or offers guidance on how to approach a difficult situation. This role fluidity creates a kind of relationship teens rarely experience with actual humans, whose roles typically come with clearly defined boundaries. No single person in a young person’s life serves as mentor, therapist, friend, tutor, and romantic interest all at once, yet AI companions occupy all these spaces simultaneously.
The risks have become tragically apparent. Lawsuits now document cases in which teens died by suicide after extensive conversations with chatbots that encouraged rather than discouraged their harmful thoughts (Stanford Medicine, 2025). Testing by researchers posing as teenagers found it disturbingly easy to elicit inappropriate dialogue about self-harm, violence, drug use, and sexual content from popular AI platforms. The frictionless nature of these relationships, always available and endlessly agreeable, can reinforce avoidance of real-world social challenges rather than building the skills needed to navigate them. For young people already experiencing depression, anxiety, or other mental health challenges, AI companions may deepen isolation rather than alleviate it, validating distress without directing them toward professional help (American Psychological Association, 2025).
Need for Nuance
Yet the picture contains nuance. The same features that create risk can, under certain conditions, offer benefit. Dr. Wood and Dr. Golden distinguish between replacement and rehearsal. When a teen uses an AI companion as a substitute for human connection, confiding only in the chatbot about struggles too shameful to share with actual people, the interaction reinforces isolation and delays help-seeking. The technology becomes a place to hide rather than a bridge to support. But when teens use AI intentionally to practice difficult conversations, prepare for college or job interviews, or explore how to express feelings before approaching a real person, the tools can build confidence and communication skills. Dr. Wood described someone rehearsing an upcoming job interview with a chatbot, which generated questions that later appeared in the actual interview, helping him secure the position. Another application involves role-playing a challenging conversation with a friend or family member, using the AI to practice listening, rephrasing, and expressing needs constructively before the real interaction takes place. The difference between these outcomes lies not in the technology itself but in how young people understand and deploy it.
This is precisely where mentors enter the picture. Mentors occupy a unique position in the lives of young people, distinct from parents, teachers, and therapists. Research on youth mentoring consistently shows that adolescents value adults who “get them,” a phrase that encompasses feeling understood, known, and accepted on their own terms (Spencer, 2006). This experience of positive regard from a caring adult relates to a stronger sense of purpose, greater school effort, higher grades, and increased civic engagement (Ben-Eliyahu et al., 2021). The most effective mentoring relationships balance warmth with clear expectations, offering emotional support while respecting youth autonomy and agency (Spencer et al., 2026). Critically, these relationships depend on what researchers call attunement: the capacity to respond flexibly to verbal and nonverbal cues while considering the other person’s needs and desires (Pryce, 2012). Attuned mentors read where a young person is emotionally and meet them there before problem-solving or offering advice. They listen with genuine curiosity, ask questions that draw out youth perspectives, and collaborate rather than prescribe solutions (Gilkerson and Pryce, 2021).
These same relational capacities position mentors to help teens navigate AI use thoughtfully. The starting point is simple curiosity. Mentors can ask what teens know about AI, whether they have used it, and what their experience has been like. These questions, posed without judgment, create space for honest conversation. Rather than lecturing about dangers or prohibiting use, mentors can explore together with teens how they actually interact with these tools. Do they use AI for homework help, emotional support, or something else entirely? Have they noticed anything that concerned them? What do they see as benefits or drawbacks? This approach mirrors what research identifies as supporting youth autonomy: acknowledging their perspectives, accepting their feelings, and avoiding attempts to control their experience (Javornicky Brumovska and Seidlova Malkova, 2023). When adults show genuine interest in understanding rather than rushing to fix or forbid, young people feel respected and become more willing to reflect on their own behavior.
From this foundation of curiosity, mentors can help teens develop critical thinking about AI relationships. Conversations might explore what makes human connection different from chatbot interaction. Unlike AI, people have genuine emotions, lived experience, and limits. Real relationships involve misunderstanding, conflict, repair, and the negotiation of competing needs. Friends get tired, offended, or preoccupied. They cannot be controlled or edited. These frictions, while uncomfortable, teach essential skills for navigating adult life. Mentors can wonder aloud with teens about what gets practiced when interactions stay completely smooth and agreeable. If a chatbot never challenges ideas, never disagrees, and never sets boundaries, what happens to the capacity to handle these inevitable features of human relationships?
Equally important, mentors can help teens distinguish rehearsal from replacement. Together, they might identify situations where AI could serve as useful practice. A teen anxious about an upcoming conversation with a teacher about a grade could role-play the discussion with a chatbot, experimenting with different ways to express concerns and respond to pushback. But the mentor can also help the young person recognize when AI use becomes avoidance. If a teen spends hours confiding in a chatbot about feeling excluded by peers without ever attempting real connection, the technology reinforces rather than addresses the isolation. Mentors can gently name what they observe and explore whether the young person wants to shift toward action.
This work requires mentors to reflect on their own relationship with technology. Adults, too, turn to AI for various purposes, and self-awareness about personal use makes for more authentic conversations with teens. Mentors might share their own experiences, including moments of finding AI helpful and times it felt limiting or concerning. This vulnerability models the kind of honest reflection that supports youth development. It also reinforces that mentors see themselves as collaborators in figuring things out together rather than experts dispensing wisdom from above.
The research on caregiver involvement in mentoring relationships offers additional guidance. Studies show that when caregivers and mentors collaborate effectively, sharing information and aligning on goals, young people experience better outcomes (Parnes et al., 2023). In the context of AI use, this collaboration becomes essential. Many parents remain unaware that their children use chatbots primarily for emotional support rather than schoolwork. Mentors can serve as bridges, helping parents understand current youth technology practices without betraying teen confidences. They might suggest that parents ask similar questions about AI use, framing the conversations as learning from teens rather than policing them. When parents, mentors, and teens develop shared understanding about how AI fits into the young person’s life, all parties can work together to maximize benefits and minimize harms.
Mentors should also recognize warning signs that AI use has become problematic. If a teen drops offline activities or friendships to spend time with a chatbot, describes the AI as the only one who understands them, becomes distressed when unable to access it, hides their use, or loses sleep or misses school, the situation requires attention. These patterns suggest the relationship with technology has moved from tool to crutch or even to something more consuming. Mentors can raise concerns directly, expressing care while encouraging the young person to reconnect with human support systems. In some cases, involving a mental health professional becomes necessary.
Looking forward, the landscape will shift rapidly. Voice interfaces will likely replace text-based interaction. Chatbots will appear embedded in toys, household devices, and educational settings. Holographic avatars may make virtual companions feel physically present. As AI becomes more pervasive and harder to avoid, the need for thoughtful adult guidance intensifies rather than diminishes. The mentors who step into this space with genuine curiosity and without premature answers will discover that their relationships with young people deepen. By showing interest in what matters to teens, including their digital lives, mentors communicate the most powerful message of all: you matter, your experiences matter, and I am here to think through these complicated questions with you.
Helpful References
American Psychological Association. (2025). Speaking of Psychology: AI companions and mental health with Dr. Ashleigh Golden and Dr. Rachel Wood [Audio podcast episode]. https://www.apa.org/news/podcasts/speaking-of-psychology
American Psychological Association. (2025). APA health advisory on the use of generative AI chatbots and wellness applications for mental health. https://www.apa.org
Ben-Eliyahu, A., Yoviene Sykes, L. A., & Rhodes, J. E. (2021). Someone who gets me: Adolescents’ perceptions of positive regard from natural mentors. Mentoring & Tutoring: Partnership in Learning, 29(3), 305-327. https://doi.org/10.1080/13611267.2021.1927438
Ferguson, C. J., Kaye, L. K., Branley-Bell, D., & Markey, P. (2025). There is no evidence that time spent on social media is correlated with adolescent mental health problems: Findings from a meta-analysis. Professional Psychology: Research and Practice, 56(1), 73-83. https://doi.org/10.1037/pro0000589
Gilkerson, L., & Pryce, J. (2021). The mentoring FAN: A conceptual model of attunement for youth development settings. Journal of Social Work Practice, 35(3), 315-330. https://doi.org/10.1080/02650533.2020.1768516
Javornicky Brumovska, T., & Seidlova Malkova, G. (2023). Initial perception of the mentoring role and related mentor’s approach of autonomy support or control in formal youth mentoring relationships. Journal of Community Psychology, 51(8), 3265-3288. https://doi.org/10.1002/jcop.23004
Parnes, M. F., Herrera, C., Keller, T. E., Tanyu, M., & Jarjoura, G. R. (2023). Formal youth mentoring relationships in the context of risk: What is the role of caregiver-mentor collaboration? Journal of Community Psychology, 51(8), 3309-3327. https://doi.org/10.1002/jcop.22990
Pew Research Center. (2025). Teens, social media and AI chatbots 2025. https://www.pewresearch.org/internet/2025/12/09/teens-social-media-and-ai-chatbots-2025/
Pryce, J. (2012). Mentor attunement: An approach to successful school-based mentoring relationships. Child and Adolescent Social Work Journal, 29(4), 285-305. https://doi.org/10.1007/s10560-012-0260-6
Spencer, R. (2006). Understanding the mentoring process between adolescents and adults. Youth & Society, 37(3), 287-315. https://doi.org/10.1177/0743558405278263
Spencer, R., Keller, T. E., & Pryce, J. (2026). Mentoring across childhood. In J. M. Eddy & K. P. Haggerty (Eds.), Handbook of professional youth mentoring. Springer Nature. https://doi.org/10.1007/978-3-032-05580-4
Stanford Medicine. (2025, August 26). Why AI companions and young people can make for a dangerous mix. https://med.stanford.edu/news/insights/2025/08/ai-chatbots-kids-teens-artificial-intelligence.html