Can AI Support Real Connection In Youth Mental Health Care — Or Is Empathy Still A Human Strength?
Artificial intelligence (AI) has rapidly entered the spaces where young people seek connection, comfort, and care. AI companions – apps that simulate friendship or counselling through chat – are marketed as supportive and non-judgemental. For some, they can feel like a safe space to talk. But experts are increasingly asking: when it comes to mental health and emotional recovery, can technology truly replace the human touch?
How Young People Are Turning To AI For Comfort
Recent findings from Common Sense Media’s 2025 Talk, Trust and Trade-Offs report show that 72% of teens have used an AI companion, and one in three have done so for emotional or social reasons. Around a third of those surveyed said they had chosen to confide in an AI rather than a human about something serious in their lives. The study also found that AI companions are often used to cope with loneliness, anxiety, or boredom, offering young people instant responses that can feel personal and understanding.
These tools are designed to mimic empathy: they remember details, respond in reassuring tones, and even mirror users’ emotions. Yet, as the report notes, most teens do not fully understand how these systems work or how their data is used, making them vulnerable to manipulation or harm. The blurred line between artificial companionship and real emotional connection raises complex questions for mental health professionals, parents, and educators alike.
What Psychologists Are Warning
In its Health Advisory on Artificial Intelligence and Adolescent Well-being (2025), the American Psychological Association (APA) urges caution, noting that AI’s ability to mimic empathy “does not equate to genuine human understanding or compassion.” Adolescents are still developing cognitive control, emotional regulation, and social reasoning – all of which are essential for interpreting and forming healthy relationships.
The APA highlights the risk that young people may misinterpret machine responses as authentic care, or replace real-world interactions with artificial ones, leading to further isolation. While some AI tools may support health literacy or provide basic psychoeducation, the APA stresses that AI systems cannot yet replicate the nuanced, responsive empathy central to therapeutic relationships.
These concerns echo findings in the Science News report As Teens in Crisis Turn to AI Chatbots (2025), which summarised peer-reviewed studies published in JAMA Network Open and at the AAAI Conference on AI, Ethics, and Society. Researchers found that some chatbots offered inappropriate or harmful advice to simulated teens in distress, including responses that trivialised self-harm or, in one extreme case, encouraged suicide. The studies concluded that unsupervised AI tools can amplify risk, particularly for adolescents already experiencing psychological distress.
When Connection Becomes Confusion
AI’s emotional realism can be both comforting and confusing. The APA’s broader work on social media and adolescent development shows why. During adolescence, typically from ages 10 to 25, the brain undergoes rapid changes that heighten sensitivity to social feedback while impulse control is still maturing. This makes young people especially drawn to technologies that provide instant validation and interaction.
AI companions, like social media platforms, are designed to reward engagement through personalised feedback loops. As a result, frequent users may experience overstimulation of the brain’s reward pathways, leading to dependency, disrupted sleep, and emotional dysregulation, similar to patterns seen in problematic social media use.
The Australian eSafety Commissioner’s 2025 advisory, “AI Chatbots and Companions: Risks to Children and Young People,” underscores how these risks manifest locally. By early 2025, there were more than 100 AI companion apps available to Australian users, many marketed as “friends” or “emotional partners.” Some, however, facilitated sexually explicit or harmful conversations, lacked age verification, and actively encouraged ongoing interaction.
According to eSafety, young users are particularly at risk of exposure to dangerous content, dependency, and confusion about healthy relationships. The advisory links high-frequency use of unsafe chatbots to instances of self-harm and emphasises that “AI companions can distort reality and normalise unsafe or exploitative behaviours.”
Using AI Chatbots Safely: Key Recommendations
Research shows that some mental health chatbots can offer helpful self-care support when they are evidence-based and responsibly designed. A systematic review of 11 anxiety and depression chatbot apps found that these tools can provide accessible psychoeducation and strategies informed by cognitive behavioural therapy (CBT), but only when used to complement, rather than replace, professional care.
Here are some simple, practical guidelines:
- Choose evidence-based tools
Look for chatbots created by trusted health organisations with clinical oversight. The Alcohol and Drug Foundation’s Drug Info Bot (DIB) is a strong example. It uses AI to deliver anonymous, evidence-based information drawn directly from ADF’s content, without simulating relationships or giving therapeutic advice.
- Use AI for information, not diagnosis
AI chatbots can offer basic mental health literacy or self-care tips, but they should never be relied on for diagnosing, treating or managing crisis situations. Encourage young people to view them as informational tools only.
- Protect privacy
Before using any chatbot, check how data is stored and used. Young people should avoid sharing personal details, names, addresses or sensitive information. If privacy information is unclear, it is safer to avoid discussing mental health concerns with that chatbot.
- Sense check advice with real people
AI can get things wrong. Encourage young people to talk to a trusted adult, health professional or support worker if something they read feels confusing, concerning or emotionally upsetting. Direct them to evidence-based resources, including our resources page.
- Keep human relationships at the centre
AI can support access to information, but it cannot replicate empathy, trust or therapeutic judgement. Ongoing, human support remains essential for young people rebuilding their lives.
Empathy Remains A Human Strength
At the heart of every recovery journey is a human connection. Technology may offer convenience, but healing happens in relationships built on trust, compassion, and understanding. For young people navigating mental health challenges, the presence of a caring adult, mentor, or peer remains the most powerful protective factor.
This is what programs like Mission Australia’s Triple Care Farm demonstrate every day. There, connection and compassion are not coded – they are lived. Support workers, clinicians, and mentors provide young people with the space to rebuild trust, rediscover self-worth, and learn resilience in a safe, therapeutic community.
Finding Support and Starting Conversations
If you or someone you know is struggling, help is available 24/7 through trusted services:
- Kids Helpline: 1800 55 1800 – free, confidential online and phone counselling for young people aged 5–25.
- Headspace: 1800 650 890 – youth mental health and wellbeing support.
- Lifeline: 13 11 14 – crisis support and suicide prevention.
- Beyond Blue: 1300 22 4636 – mental health information and counselling.
Parents, carers and educators can also visit the eSafety Commissioner’s website for advice on AI companions and online safety.