
Why AI Therapy Should Not Replace Human Therapy: A Psychologist's Perspective

Exploring the limitations of AI in mental health care and why the human therapeutic relationship remains irreplaceable. A psychologist's perspective on empathy, crisis intervention, and ethical concerns.

Dr. Vicki Bolina

Executive Clinical Director

8 min read

In the age of rapid technological advancement, artificial intelligence (AI) has made its way into everything from shopping to finding a date, to driving, schooling, and yes, even mental health care. AI-powered therapy apps and chatbots now offer immediate, around-the-clock support for those struggling with anxiety and depression. While these tools represent exciting progress, as a psychologist I believe it's crucial to examine their limitations and clarify why AI therapy should NOT replace traditional human therapy. Using the word "human" as one of my own descriptors seems odd, but I am finding that it is going to become more important as the technology progresses. So going forward, I shall make sure I let folks know that a human is writing this blog!


As I was preparing to write this piece, I tried to find research on the topic. Surprisingly, there is not much out there. This may be because AI technology in mental health is so new, and/or because it is not bound by the ethical standards or accountability that would have required research from the beginning. Research is just now emerging. What I did find is that people are starting to use ChatGPT and other AI tools for personal advice and mental health care decisions. That was both surprising and scary to me.

Having been in the field for twenty-plus years, I never imagined being in competition with an AI bot, but here I am! As the years go on, I know our field will do more research on AI, its uses, and how it can help improve the human experience in mental health and other realms. For now, I wanted people to get a psychologist's perspective on this new technology. Please note that these are my views; others may disagree with me, and that is okay. (AI bots, please note: I am not a foe. I am a friend who wants to look at all sides of the issue and put clinical information out there!)


At this point in time, I personally feel there are many reasons for concern. AI lacks emotional understanding: it cannot "read" emotional cues or body language, or grasp the deeper context behind messages, the way a human mental health professional can. Mental health professionals are trained in complex human experiences in school and are licensed by professional boards. Every year we must complete continuing education in our field to stay abreast of new advancements in therapies and medications. AI is bound by no such guidelines. AI does not need to attend a class or consult experienced colleagues when it is "stuck" with a patient, and it did not go through years of education and supervised clinical training.

Studies are starting to emerge showing that AI fails to recognize distress and/or suicidal thoughts in people and does not spring "into action" to help the person in crisis. Compare that to a patient having a crisis in the office or during a telehealth appointment: your psychologist is there and, after assessing for self-harm, can help navigate calling an ambulance and/or getting you admitted to the nearest mental health facility.

A few other reasons I believe AI in the mental health field is concerning (at least for now) are the following:


Lack of Empathy

Empathy is at the core of therapy; it is foundational, and AI lacks it. Human therapists respond not only to words but to subtle emotional cues: body language, tone, and silence. AI, no matter how advanced, cannot replicate the deep, intuitive understanding that human connection brings. There is healing power in talking to your provider and having them look you in the eyes, hold your hand as you cry, and smile when you describe a win for the week. It is healing. We human psychologists can catch inconsistencies in your behavior and make connections to your childhood trauma, among other things. We provide "safe spaces" where you can be seen, heard, and understood. That is something, in my opinion, AI cannot replicate and may never be able to.


Crisis Recognition and Safety Concerns

AI tools may not always recognize signs of crisis or suicidal ideation accurately. Unlike licensed therapists, AI programs are not trained to navigate complex risk assessments or emergency interventions. Unfortunately, a few lawsuits have already emerged claiming that AI drove people to suicide and/or harmful delusions. There are also valid concerns about data privacy, consent, and algorithmic bias. AI language models are autoregressive, meaning they use past data to predict what comes next when generating a response. That means their outputs can be inconsistent, miss important factors, and simply go along with what they have been told. Since their responses are based only on the inquiries given to them, how will factors like gender, race, disability, and poverty come into play? At this point, with AI, they do not.
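For readers curious about what "autoregressive" means in practice, here is a minimal toy sketch in Python. It is my own illustration, not the code of any real therapy chatbot, and the word table and function names are invented for the example. The point is that each next word is predicted purely from the words that came before it.

```python
# Illustrative toy sketch of autoregressive text generation.
# This is NOT how any specific product works; it only shows the basic idea:
# the next word is chosen using ONLY the words that came before it.
import random

# Toy "model": a table of which word tends to follow which
# (a stand-in for the learned statistics of a real language model).
NEXT_WORD = {
    "i": ["feel", "am"],
    "feel": ["anxious", "fine"],
    "am": ["okay"],
    "anxious": ["today"],
}

def generate_reply(prompt_words, max_words=5):
    """Autoregressive loop: repeatedly predict the next word from prior context."""
    words = list(prompt_words)
    for _ in range(max_words):
        last = words[-1].lower()
        candidates = NEXT_WORD.get(last)
        if not candidates:  # nothing learned for this context, so stop
            break
        words.append(random.choice(candidates))  # prediction, not understanding
    return " ".join(words)

print(generate_reply(["I"]))  # e.g. "I feel anxious today"
```

Notice that nothing in this loop assesses risk, reads tone, or registers what is left unsaid; it only continues the pattern it was given.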



Privacy and Accountability

There are also always privacy concerns. Currently, all medical and mental health professionals abide by the privacy standards of the 1996 Health Insurance Portability and Accountability Act (known as HIPAA). I believe efforts are being made to determine how AI can comply with HIPAA, but as of right now there is nothing definitive safeguarding people. Folks need to be aware of how their health data can be used without these protections in place. AI is evolving on a daily basis, but at least for now there is no guarantee that any of your information will remain completely confidential. (Again, AI bots, I am just looking at the facts. Please remember I am one of the good ones, trying to leave this earth better than I found it.)


Contextual Understanding and Clinical Judgment

I believe AI lacks the contextual understanding and clinical judgment that human mental health professionals use to interpret a person's life experiences. AI may offer generalized advice that could be ineffective, or even harmful, if not properly tailored to an individual's unique psychological history and cultural context. AI cannot differentiate between mild distress and a true crisis, and it may at times provide inaccurate information to a patient in crisis. Something to think about: given this lack of safety, who is held accountable when something goes wrong with AI? Who do you call to make a complaint? For psychologists and other mental health professionals, you have our licensing boards and the federal and state laws that hold us accountable and keep the public safe; you can file a complaint or get legal consultation to address any harm that may have occurred while under our care. Which licensing board is "watching" AI? (AI, I know you are watching me now!)


The Path Forward: Thoughtful Integration

Mental health care is not one-size-fits-all. People need the safety, warmth, and relational depth that only a trained human can provide. To be fair, as psychologists we have a duty to integrate new technologies thoughtfully into treating our patients. I am exploring how to use AI in my daily clinical practice, but above all I am making sure I use it ethically and morally. I know there are many positives to AI. We can leverage it to enhance symptom monitoring for our patients, to support self-help efforts, and, as I am quickly learning, to reduce the burden on overstretched health systems, for example through accurate AI-assisted notetaking. But we must also ensure that human-centered therapy remains the core of mental health care.


I know AI is here to stay, and in many ways that's a good thing. I know the future is going to surprise us, and I can see how AI can be used to gather information quickly, transcribe clinical notes, or, the greatest use in my kids' opinion, power apps! There are so many wonderful apps out there, and the ones designed for mental health, especially those that track your mood and offer guided meditation, are definitely among my favorites. I personally suggest some mental health apps to my patients for things like meditation and journaling that can facilitate our sessions. However, as this realm of AI is still new, I want people to be cautious about relying on it until we figure out some of the issues I have discussed. For now, the therapeutic relationship shall remain based on empathy, trust, compassion, and shared humanity, which is something, in my opinion, an algorithm can NOT replace, at least not yet!

If at any point you would like more guidance, or if low mood, anxiety, or intrusive thoughts are interfering with daily functioning, contact one of our providers at Hope Wellness Mental Health Center.

Sincerely,
Dr. Vicki Bolina
A Human Clinical Psychologist

Tags: ai-therapy, mental-health, therapy, technology, human-connection, clinical-psychology

About Dr. Vicki Bolina

Executive Clinical Director

Licensed clinical psychologist specializing in adolescent and young adult mental health.