
Psychological Impacts of AI Companions and Therapeutic Chatbots: Promise, Pitfalls, and Psychological Caution

Written by Rekha Kangokar Rama Rao, Counselling Psychologist



AI Companions in Mental Health: Promise, Pitfalls, and Psychological Caution


As a psychologist, I approach AI companions and therapeutic chatbots with both clinical curiosity and professional caution.

These tools, ranging from mental health chatbots offering cognitive-behavioural strategies to emotionally responsive AI companions, are increasingly embedded in everyday life.


Their rapid spread raises an important question: are these tools genuinely widening access to psychological help, or merely changing how people cope with stress, intimacy, and care?


On the positive side, AI companions and therapeutic chatbots address a long-standing gap in mental health care: access.



Globally, and particularly in low- and middle-income countries, the shortage of trained mental health professionals is severe. AI-based interventions can provide immediate, low-cost, and stigma-free support at scale. Empirical research indicates that therapeutic chatbots based on evidence-based methodologies, such as cognitive behavioral therapy (CBT), may alleviate symptoms of depression and anxiety, at least in the short term (Fitzpatrick, Darcy, & Vierhile, 2017; Inkster et al., 2018).


For individuals reluctant to seek face-to-face therapy due to stigma or fear of judgment, the perceived anonymity of AI can lower the threshold for disclosure.

AI companions may also offer emotional validation and consistency: they are always available, never impatient, and endlessly responsive. Research on relational agents shows that users can form emotional bonds with AI, experiencing feelings of being heard and supported (Ta et al., 2020).



For individuals experiencing loneliness or social isolation, such interactions may serve as a psychological buffer, reducing distress and promoting a sense of connection.


However, these benefits come with significant psychological and ethical trade-offs.


A central concern is the illusion of empathy.

While AI can simulate empathic responses through language models, it does not possess emotional understanding, moral reasoning, or clinical judgment.


From a psychological perspective, empathy transcends mere verbal affirmation; it is rooted in shared human experience, emotional resonance, and moral obligation.


Overreliance on simulated empathy risks fostering what Cheung (2013) describes as “relational displacement,” where technologically mediated interactions replace, rather than supplement, human relationships.


There is also the danger of emotional dependency.


Some users may begin to prefer AI companions precisely because they are predictable and non-confrontational.


This can subtly reinforce avoidance coping, limiting opportunities to develop tolerance for interpersonal complexity, conflict, and emotional vulnerability with real people. Longitudinal data on this risk are limited, but early evidence suggests that excessive engagement with relational AI may correlate with increased loneliness and reduced motivation for offline social interaction (Peng et al., 2025).


Therapeutic chatbots also pose clinical safety issues.


They can help with mild to moderate symptoms, but they are poorly equipped to handle crises, complex trauma, or severe psychopathology.


Misplaced trust in AI tools could delay appropriate professional help, especially if users mistake chatbot reassurance for clinical assessment. Moreover, algorithmic bias and data privacy concerns can inflict further psychological harm, particularly on marginalized groups whose experiences may be insufficiently reflected in training datasets (Blease et al., 2019).


The strongest argument, then, is not whether AI companions and therapeutic chatbots are “good” or “bad,” but how they are positioned within the mental health ecosystem.


When framed as supplements (tools for psychoeducation, emotional regulation practice, and early support), they hold genuine promise.


When framed as substitutes for human care, they risk flattening the depth of psychological healing into scripted responsiveness.


Ultimately, the psychological impact of AI companions forces us to confront a deeper question: are we using technology to expand human connection, or to avoid it?

The answer will depend less on the sophistication of the algorithms and more on the ethical, clinical, and relational boundaries we choose to maintain.


References 

Blease, C., Kaptchuk, T. J., Bernstein, M. H., Mandl, K. D., Halamka, J. D., & DesRoches, C. M. (2019). Artificial intelligence and the future of primary care: Exploratory qualitative study of UK general practitioners’ views. Journal of Medical Internet Research, 21(3), e12802. https://doi.org/10.2196/12802

Cheung, J. C. S. (2013). Alone together: Why we expect more from technology and less from each other. Taylor and Francis.

Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785

Inkster, B., Sarda, S., & Subramanian, V. (2018). An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: Real-world data evaluation. JMIR mHealth and uHealth, 6(11), e12106. https://doi.org/10.2196/12106

Peng, C., Zhang, S., Wen, F., & Liu, K. (2025). How loneliness leads to the conversational AI usage intention: The roles of anthropomorphic interface, para-social interaction. Current Psychology, 44(9), 8177–8189.

Ta, V., Griffith, C., Boatfield, C., Wang, X., Civitello, M., Bader, H., & DeCero, E. (2020). User experiences of social support from companion chatbots in everyday contexts: Thematic analysis. Journal of Medical Internet Research, 22(3), e16235. https://doi.org/10.2196/16235



 
 
 
