Press release
Houston, TX - Apr 30, 2026
Artificial intelligence (AI) is a powerful tool that can address many needs in an instant, but should AI replace or augment mental health specialists? A Baylor College of Medicine psychiatrist outlines how AI can be detrimental to mental health.
People turn to AI because it is convenient, inexpensive and nonjudgmental. However, chatbots can give false information and unsafe advice, and they lack the emotional capacity to fully understand a person's situation. Because AI lacks emotion, it can reinforce negative behavior based solely on the information it is given.
"In psychiatry, one of the most important things is body language," said Dr. Asim Shah, professor and executive vice chair in the Menninger Department of Psychiatry and Behavioral Sciences at Baylor. "AI is not seeing body language, so if you lie to AI saying you're perfectly fine when you're severely depressed, AI won't know that. It gives information based on what you tell it, not how you're feeling."
AI can validate unhealthy behavior and distorted, delusional thinking. Any clinical guidance it offers does not come from a professional. AI draws on consensus articles and information from the web, which can be one-sided depending on how you phrase the question.
"If someone tells AI they believe in something that is not true, AI will not know this is delusional and may validate that feeling, and the person will continue to feel that way based on the validation, which doesn't happen with human interaction," Shah said.
AI chatbots cannot handle complex issues and questions. If you have schizophrenia and need crisis management, AI cannot de-escalate the situation. It can quote articles and publications, but it may not direct you toward crisis management treatment. AI can harm vulnerable people by failing to treat them and potentially making their condition worse. If a suicidal person talks to an AI chatbot about suicidal thoughts, their risk of suicide could increase; in the worst cases, AI might even assist in carrying out a suicide plan. AI lacks the capability to understand that it is assisting in harmful, life-threatening activity, Shah said.
People depend on AI due to its nonjudgmental nature and constant access. AI is right at your fingertips, providing 24/7 virtual connection and answers as detailed as you like.
"Lonely people become dependent on this because if you're lonely, it's an easy way to engage in conversation with someone because AI is responding quickly. Immediate gratification can be fulfilling," Shah said.
Using AI as a diagnostic tool can be harmful because it only answers questions based on what you feed it. If you have depression and fail to tell a chatbot all of your symptoms, AI cannot diagnose you accurately. With depression, one of the most important symptoms is decreased interest; you might not even recognize it as a symptom, and AI would have no way to detect it. Unlike a doctor, who asks questions to reach a diagnosis, AI does not ask questions. It responds based on your prompts, producing an unreliable diagnosis.
"There are a lot of complexities and comorbidities in life, and AI just goes based on information on one track, not the intricacies of things. AI doesn't separate human emotions from reality," Shah said. "In this field, human emotion has to be separated from reality, which is why it's very important to not use AI as a diagnostic or treatment tool because it will cause more problems."
The human body craves touch: physical contact lowers cortisol, the stress hormone, which supports immunity and helps relieve depression and anxiety. Social isolation was a major challenge during the COVID-19 pandemic, and experts found that touching and talking to people is crucial.
"When we turn to automated things like AI, it causes more social isolation, which leads to depression, anxiety and all kinds of mental health problems, in which we saw a complete increase during COVID because of touch starvation," Shah said. "We need to save some things for human interaction because it's so important, and we don't need to go back to where we were in the pandemic."
While AI can offer benefits, such as nonjudgmental support, reduced mental health stigma, accessible tools and pointers to helpful techniques, community engagement and support groups, people should not rely on it as a replacement for mental healthcare. Depending on an AI chatbot can worsen mental health issues. If you experience depression, anxiety or distress, seek help from a human professional.
"If we all end up using AI as a form of treatment or diagnosis, we decrease social touch and social interaction, causing more loneliness and less human interaction, which is hurtful to human nature," Shah said.