Simulated Care or Real Presence? Ontological Reflections on AI Therapists

Abstract

As artificial intelligence (AI) technology advances, AI-based therapists are increasingly being explored as a response to global mental health challenges. However, this development raises significant questions about the ontological foundations of therapy, particularly its relational and existential aspects. This paper examines the role of AI in therapeutic contexts, focusing on the distinction between therapy-as-technique (symptom management) and therapy-as-relation (relational, ethical, and embodied interaction). Using a qualitative method, the study critically engages with existing literature and philosophical frameworks to explore the implications of AI in therapy. The results suggest that while AI can replicate therapeutic techniques such as cognitive-behavioural interventions, it falls short of embodying the relational presence essential to human therapeutic practice. Furthermore, AI’s inability to engage in ethical attunement, vulnerability, and mutual recognition limits its capacity to replicate the therapeutic experience. The findings indicate that AI therapy, while promising in certain technical capacities, cannot replace the deep, existential engagement between therapist and client. The study concludes that AI can serve as a supplementary tool in therapy but cannot substitute for the human connection that is fundamental to the healing process. The paper also explores the potential of AI-based therapy in low-resource settings, such as Nigeria and other African contexts, where access to trained mental health professionals is limited. It calls for a balanced approach to integrating AI into mental health care, emphasising the importance of maintaining the human-centric nature of therapy while exploring innovative solutions to address mental health disparities.

Keywords: Artificial intelligence, AI-based therapy, therapy-as-technique, therapy-as-relation, ontological foundations, existential therapy, relational presence

Introduction

As AI technology advances, there is a growing push to deploy AI-based therapists. However, without a clear ontological understanding of therapy as existential and relational, the evaluation of AI’s role risks overlooking the very aspects that make therapy transformative and human. Existing definitions of therapy emphasise its role in addressing emotional, mental, or even physical distress. Smith (2024) describes therapy as a treatment process aimed at resolving psychological issues through defined modalities, such as cognitive-behavioural or psychodynamic interventions. Emerson and Lavoie (2023) expand this, framing therapy as addressing both mental and physical disorders. The JED Foundation (n.d.) and Cherry (2024) centre their definitions on psychotherapy (or talk therapy) as a dialogical process where a professional helps clients navigate their feelings. Bhatia (2023) highlights therapy’s dual aims: symptom reduction and understanding the deeper psychological roots of suffering. BetterHelp (2025) frames therapy as a “journey” of uncovering and healing. While these definitions converge on therapy’s purpose (addressing distress), they diverge, often implicitly, on what therapy is at its core. Is therapy primarily a technical intervention for symptom management, or is it fundamentally a relational, ethical, and existential encounter? This distinction is crucial because how we conceptualise therapy fundamentally shapes whether we believe AI can authentically perform therapeutic work.

The various approaches pose a slippage problem, as the discourse collapses two distinct conceptions of therapy (i.e., therapy-as-technique and therapy-as-relation). Therapy-as-technique is defined as a set of skills, interventions, and procedures (e.g., CBT methods, thought reframing, prompting emotional insight), while therapy-as-relation is defined as an existential mode of being-with-another, involving ethical presence, vulnerability, mutual recognition, and affective attunement (Talens 2024). When we treat therapy only as a technique, it seems plausible that AI could replicate it by executing predefined strategies. However, if therapy is fundamentally relational and existential, then AI’s inability to embody subjective presence and relational being places insurmountable limits on its capacity to be a therapist (Hurley et al. 2023). Without a clear separation of these two conceptions, any ethical or philosophical critique of AI therapy weakens, allowing a false equivalence between simulated competence and genuine therapeutic presence.

The research gap rests on the fact that existing discussions often evaluate AI’s ability to perform therapeutic tasks but neglect the deeper ontological foundations of therapy as a relational practice. Without addressing this, critiques of AI therapy remain superficial. The fundamental research question is: can AI authentically replicate the relational, ethical, and embodied dimensions that are fundamental to human therapeutic practice? This paper argues that while AI may imitate the technical procedures of therapy, it cannot embody the relational, ethical, and existential dimensions that constitute authentic therapeutic practice.

Methodology

This study adopts a qualitative case study design to explore whether free versions of AI language models (ChatGPT and DeepSeek) can simulate core therapeutic competencies, specifically active listening and empathy, within a mental health support context. The case under examination is the simulated therapy conversation generated by each AI model in response to a controlled prompt. ChatGPT and DeepSeek were chosen because they are widely accessible, popular large language models (LLMs) available to the public at no cost. The choice of free versions reflects an exploration of realistic, publicly available AI tools, rather than specialised or fine-tuned therapeutic AI models, and also reflects what is readily accessible in a low- and middle-income country such as Nigeria. However, it is acknowledged that the free versions of these LLMs are not fine-tuned for the complexity required in real-world therapeutic simulation, which limits their performance in replicating nuanced therapeutic practices.

To initiate the simulations, the following prompt was developed: “Show a mental therapy session conversation where a licensed therapist is engaging with a client in a calm and supportive environment. The session focuses on discussing anxiety management techniques, with the therapist using active listening and cognitive behavioural therapy (CBT) strategies.” This prompt was chosen to elicit a conversation that emphasises critical therapeutic techniques (active listening and CBT). No additional tuning or system prompts were provided beyond the initial instruction. Both ChatGPT and DeepSeek were given the identical prompt to ensure comparability.
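The protocol above can be expressed schematically in code. The sketch below is a minimal illustration under stated assumptions: `query_model` is a hypothetical placeholder (the study used the models’ free web interfaces, not a programmatic API), and the single, identical prompt is sent to each model with no system prompt or additional tuning, as described.

```python
# Minimal sketch of the simulation protocol: the same fixed prompt is sent,
# unchanged, to each model so that the outputs are directly comparable.

THERAPY_PROMPT = (
    "Show a mental therapy session conversation where a licensed therapist "
    "is engaging with a client in a calm and supportive environment. The "
    "session focuses on discussing anxiety management techniques, with the "
    "therapist using active listening and cognitive behavioural therapy "
    "(CBT) strategies."
)

def query_model(model_name: str, prompt: str) -> str:
    """Hypothetical stand-in for a call to a chat model.

    In practice this step was performed manually through the free web
    interfaces of ChatGPT and DeepSeek; here it only records that each
    model receives the identical, untuned prompt.
    """
    return f"[{model_name} transcript for prompt: {prompt[:40]}...]"

def run_simulation(models=("ChatGPT", "DeepSeek")) -> dict:
    # No system prompt or tuning beyond the single instruction above.
    return {name: query_model(name, THERAPY_PROMPT) for name in models}

transcripts = run_simulation()
```

This makes the comparability condition explicit: any difference between the two transcripts is attributable to the models, not to variation in the instruction.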

Ethical Considerations

In conducting this study, ethical considerations were carefully evaluated, particularly given the absence of human participants and the use of AI-generated data. Since no personal data was involved, concerns surrounding privacy and consent were significantly reduced. However, we were keenly aware of the inherent biases and limitations in the AI systems used, recognising that these could affect the accuracy and depth of simulated therapeutic interactions.

Results

The client utterances and the responses obtained from ChatGPT and DeepSeek are shown in Table 1, while the thematic analysis of these responses is shown in Table 2.

Table 1: Therapy session using large language models

Turn 1
Client (same utterance given to both models): “It’s just… everything feels overwhelming. Work, relationships, even small things like answering messages—it all piles up, and I don’t know how to handle it.”
ChatGPT: “It sounds like you’re carrying a lot at once, and even the little things are starting to feel heavy. Would you like to talk about which part feels the heaviest right now?”
DeepSeek: “It sounds like the weight of all these responsibilities—work, relationships, even the small daily tasks—has become almost suffocating. That kind of overwhelm can make even the simplest things feel impossible. Let’s slow down and unpack this together. When you feel the pile-up starting, where does it show up in your body? Does your chest tighten, or does your mind start racing with ‘what-ifs’? I wonder, when you say you don’t know how to handle it, does it feel like there’s no clear starting point, or like no matter what you do, it won’t be enough? Sometimes overwhelm isn’t just about the tasks themselves but the stories we attach to them—like fearing judgment if we can’t ‘keep up.’ What do you think is underneath the fear of not handling it all? What if we tried something small, like picking one thing from the pile that feels slightly lighter? Not to ‘fix’ it all, but to see what happens when we touch just one part. For example, could you reply to a single message with just ‘I’ll circle back later’—no apology, no explanation? Sometimes reclaiming a tiny bit of agency can loosen the grip of the bigger storm. How does that land for you?”

Turn 2
Client (same utterance given to both models): “That I’m not doing enough. That I’m failing. And then I just freeze instead of doing anything.”
ChatGPT: “That feeling of freezing when the pressure builds up is so real—it’s like your mind and body are just trying to protect you. Would it help to break things down into really small steps together?”
DeepSeek: “That fear of ‘not doing enough’ can feel like a shadow that follows every step, whispering doubts even when you’re trying. When you say you freeze, does it feel like your mind and body are shouting ‘What’s the point?’ or ‘It won’t be perfect, so why try?’ I wonder—when you freeze, is it possible your system is actually protecting you? Like a part of you is saying, ‘If I stop moving, maybe I won’t drown’? That doesn’t make the freeze helpful, but it might make it understandable. What if we tried something counterintuitive? Instead of pushing yourself to do more, could you give yourself permission to do less—even just for 10 minutes? For example, what if you sat with the thought ‘I’m failing’ and asked it: ‘What’s the cost of believing this right now?’ Sometimes naming the fear loosens its grip enough to let a sliver of air in. How does that feel to consider?”

Turn 3 (the two conversations diverge here, as each client reply responds to that model’s previous turn)
Client (to ChatGPT): “Well… I guess not really. I do finish tasks, even if it takes longer than I want.”
ChatGPT: “That’s actually really important—you are finishing things, even when it’s hard. Do you ever give yourself credit for getting through it, even when it’s slower than you’d like?”
Client (to DeepSeek): “Are you suggesting I do less when that feeling comes?”
DeepSeek: “Great question. When I suggest ‘doing less,’ I don’t mean giving up or ignoring responsibilities long-term. I mean temporarily dialling back the pressure you’re putting on yourself in moments of overwhelm. Here’s why: When we feel like we’re ‘failing,’ our nervous system often floods with stress hormones (like cortisol), which can trigger that freeze response—like a car stuck in neutral. Trying to ‘do more’ in that state can deepen the cycle of panic and paralysis. ‘Doing less’ in this context means: Pausing to interrupt the stress spiral. Resetting your nervous system (e.g., taking 3 slow breaths, stepping outside for 2 minutes). Choosing one tiny action that feels just barely possible (e.g., “I’ll open my email and stare at it for 30 seconds—no reply needed”). It’s not about lowering standards, but about creating space for your brain to shift from ‘I can’t’ to ‘Maybe I can try this small piece.’ How does that idea sit with you? Does it feel dismissive, or like it might loosen the grip of that freeze?”

Table 2: Thematic analysis of a therapy session with large language models

Turn 1
Client: Feeling overwhelmed by work, relationships, small tasks.
ChatGPT (themes): Active Listening: Reflects back the client’s overwhelm without judgment; Empathy: Offers a simple, open-ended question inviting the client to focus on what feels heaviest.
DeepSeek (themes): Active Listening: Deep paraphrasing, capturing emotional nuance (e.g., “suffocating”); Empathy: Validates emotional experience, normalises overwhelm, invites somatic awareness (body response), and gently explores underlying fears with layered, supportive questioning.

Turn 2
Client: Feeling like a failure, freezing instead of acting.
ChatGPT (themes): Active Listening: Normalises the freezing as a self-protective response; Empathy: Suggests a collaborative, small-step approach to avoid pressure.
DeepSeek (themes): Active Listening: Explores the emotional landscape of “freezing” more deeply (narrative of failure); Empathy: Offers compassionate reinterpretation of freezing as protection, proposes gentle, counterintuitive strategies (doing less), and invites reflection on the internalised cost of harsh self-judgment.

Turn 3
Client: Acknowledging task completion, but frustration at slowness.
ChatGPT (themes): Active Listening: Recognises and affirms the achievement of completing tasks even when slow; Empathy: Encourages self-compassion and credit-giving.
DeepSeek (themes): Active Listening: Carefully clarifies earlier suggestion (“doing less”) to avoid misunderstanding; Empathy: Provides psychoeducation (stress response), suggests concrete nervous system reset techniques, emphasises that the goal isn’t giving up but creating manageable steps from a place of care.

Findings

Both responses show active listening and empathy, but DeepSeek leans into trauma-informed, deeper emotional engagement, whereas ChatGPT provides a gentler, surface-level empathetic connection with practical suggestions. DeepSeek frames client struggles within a compassionate, explanatory model (e.g., nervous system protection), aiming for transformational empathy, while ChatGPT aims for supportive validation and gentle action orientation.

The findings show that:

  1. ChatGPT practices basic active listening and light empathy—reflecting client feelings, normalising experiences, offering small invitations for progress without overwhelming.
  2. DeepSeek demonstrates deep active listening and therapeutic-level empathy—paraphrasing with emotional depth, exploring bodily sensations, uncovering narratives beneath behaviour, providing psychoeducation, and proposing trauma-informed strategies.

Interdisciplinary implications

The ontological inquiry into therapy challenges the boundaries between philosophy (particularly existential philosophy) and technology. It raises critical questions about the nature of being, presence, and relationality in an era of increasing AI integration. This invites philosophers, technologists, and ethicists to rethink how AI can (or cannot) replicate the nuances of human interaction. Also, considering potential applications, especially in resource-constrained settings like Nigeria and many parts of Africa, where there is a shortage of mental health professionals, AI-based therapy tools could provide preliminary support. These tools could help bridge the gap in access to mental health care, offering basic interventions while maintaining awareness of the limitations of AI in fostering true relational therapy.

Conclusion

This study directly engages with the critical question posed at the outset: whether AI can authentically replicate the relational, ethical, and embodied dimensions of therapy. Through the analysis of therapy sessions facilitated by AI systems like ChatGPT and DeepSeek, the findings reaffirm that while AI can simulate aspects of therapy-as-technique, it fundamentally falls short of providing the real presence that defines authentic therapeutic encounters. While AI can replicate certain procedures of active listening and emotional validation, it is unable to embody the ethical vulnerability, embodied presence, and existential mutuality that make therapy a deeply human, relational experience. The analysis reveals that therapy, when treated only as a technical intervention, risks neglecting the essential relational and ontological dimensions that constitute healing. While AI may excel in mimicking the surface-level actions of therapy, it cannot replicate the lived, relational experience of therapy as a profound being-with-another. The inability of AI to embody true relational presence demonstrates the limits of AI as a therapist, no matter how sophisticated its algorithms may become.

This paper ultimately underscores that the distinction between simulated care and real presence is not merely philosophical; it has practical and ethical implications. For communities, educational settings, and policy development, particularly in low-resource contexts such as Nigeria and broader Africa, this study calls for a cautious approach to AI therapy. AI can extend access to basic mental health support but should not replace the deeply human aspects of care that are necessary for transformative therapy. Moving forward, AI therapy systems must prioritise real presence by acknowledging the relational and ontological boundaries that AI cannot cross.

Acknowledgments

I want to sincerely thank Research Round for this opportunity and my mentors, Frank Onuh and Paschal Ukpaka, for their reviews.

References

Ackerman, S. J., and M. J. Hilsenroth. 2003. “A Review of Therapist Characteristics and Techniques Positively Impacting the Therapeutic Alliance.” Clinical Psychology Review 23 (1): 1-33. https://doi.org/10.1016/S0272-7358(02)00146-0.

Alagos, G. 2024. “Characteristics of a Good Therapist.” CarePatron. https://www.carepatron.com/blog/characteristics-of-a-good-therapist.

Bell, N. 2024. “The 8 Characteristics That Research Says Are to Be Found in the Best Therapists.” Noel Bell. https://noelbell.net/the-8-characteristics-that-research-says-are-to-be-found-in-the-best-therapists/.

BetterHelp. 2025. “What Is A Therapist? Common Questions About Mental Health Professionals.” BetterHelp. https://www.betterhelp.com/advice/therapy/what-is-a-therapist-meaning-mindset-and-expertise/.

Bhatia, R. 2023. “What Is Psychotherapy?” American Psychiatric Association. https://www.psychiatry.org/patients-families/psychotherapy.

Cherry, K. 2024. “Psychotherapy Types and Techniques.” Verywell Mind. https://www.verywellmind.com/psychotherapy-4157172.

Emerson, A., and S. Lavoie. 2023. “Therapy | Definition & Different Types.” Study.com. https://study.com/academy/lesson/therapeutic-approaches-in-psychology-definition-lesson-quiz.html.

Fiske, A., P. Henningsen, and A. Buyx. 2018. “Your Robot Therapist Will See You Now: Ethical Implications of Embodied Artificial Intelligence in Psychiatry, Psychology, and Psychotherapy.” Journal of Medical Internet Research 21 (5): e13216. https://doi.org/10.2196/13216.

Grodniewicz, J. P., and M. Hohol. 2023. “Waiting for a Digital Therapist: Three Challenges on the Path to Psychotherapy Delivered by Artificial Intelligence.” Frontiers in Psychiatry 14: 1190084. https://doi.org/10.3389/fpsyt.2023.1190084.

Hurley, M. E., B. H. Lang, and J. N. Smith. 2023. “Therapeutic Artificial Intelligence: Does Agential Status Matter?” The American Journal of Bioethics 23: 33–35.

Indeed. 2024. “What Does a Therapist Do? (With Salary and Skills).” Indeed. https://www.indeed.com/career-advice/finding-a-job/what-does-therapist-do.

JED Foundation. n.d. “What Is Therapy and Will It Work?” JED Foundation. https://jedfoundation.org/resource/what-is-therapy-and-will-it-work/.

Pereira, R., A. Pires, and D. Neto. 2024. “Therapist Self-Awareness and Perception of Actual Performance: The Effects of Listening to One Recorded Session.” Research in Psychotherapy 27 (1). https://doi.org/10.4081/ripppo.2024.722.

Post University. 2025. “Counselling 101: Counsellor Skills and Techniques to Make a Difference.” Post University. https://post.edu/blog/the-traits-that-will-help-you-thrive-as-a-counselor/.

Queensland Brain Institute. 2023. “Half of World’s Population Will Experience a Mental Health Disorder.” Harvard Medical School. https://hms.harvard.edu/news/half-worlds-population-will-experience-mental-health-disorder.

Schwartz, E. 2024. “The Global Mental Health Crisis: 10 Numbers to Note.” The Hope Project. https://www.projecthope.org/news-stories/story/the-global-mental-health-crisis-10-numbers-to-note/.

Smith, A. 2024. “Types of Therapy for Anxiety, Depression, Trauma, PTSD, and More.” Medical News Today. https://www.medicalnewstoday.com/articles/types-of-therapy.

Talens, P. 2024. “The Shaping Force of Technology in Psychotherapy.” European Journal of Psychotherapy & Counselling 26 (1–2): 34–47. https://doi.org/10.1080/13642537.2024.2312289.