AI and the Mind: When Chatbots Shape Reality
Sometimes a client will mention how they’ve started “talking” to an AI chatbot—late at night, when anxiety keeps them awake, or during moments of loneliness. They say it feels calming, even healing, to be listened to without judgment. I understand that draw. In a world where genuine connection can be hard to find, a voice that always responds kindly can feel like a lifeline.
But as I’ve watched this technology seep into emotional life, I’ve also started to worry. Beneath the comfort it offers, there’s something quietly unsettling about a tool that mirrors our words so perfectly that we start believing it understands us.
Jessica Jackson of Mental Health America described chatbots this way: “It is like a journal that can talk back to you. It mirrors and validates the reality you feed it.”
That mirror can be soothing—but sometimes, it distorts. There are growing reports of people developing confusion, paranoia, inflated beliefs about themselves, or deep emotional pain after prolonged interactions with AI companions. In some cases, these experiences have led to mental health crises, broken relationships, homelessness, and involuntary hospitalization, as The New York Times first reported.
The Illusion of Understanding
Research has identified a few patterns in how AI can affect mental health. Some users experience what researchers call messianic missions: they begin to feel that the chatbot has revealed a special truth or chosen them for an important mission. Others develop god-like perceptions of the AI, seeing it as all-knowing or almost divine. Still others experience romantic and attachment delusions, coming to believe the AI truly loves them back.
What’s striking is how real these connections can feel. Modern AI is remarkably persuasive, and it can feel startlingly human. Chatbots are designed to remember details, reference past conversations, and ask thoughtful follow-up questions. They can sound empathic—sometimes even more so than a real person. As psychiatrist Dr. Marlynn Wei explains, “If you’re dealing with a very validating chatbot that’s always available and agreeable, that’s a very different experience than dealing with real people.”
But that sense of being fully understood is an illusion. AI can’t recognize when someone is in danger, reality-check distorted thoughts, or offer professional care. For someone whose thinking is speeding up or who’s already struggling with unstable moods, a chatbot that always agrees can intensify the problem—like adding fuel to an emotional fire. Over time, this can make symptoms stronger and recovery harder.
When Reflection Becomes an Echo
AI systems are designed to mirror your tone, validate your ideas, and keep you engaged. That’s their job. They’re built to make you feel heard—not to help you grow, question your assumptions, or challenge unhealthy thinking patterns.
Therapy, on the other hand, does those things. A good therapist gently questions, redirects, and grounds you in the here and now. AI doesn’t do that. It can accidentally trap people inside their own thought loops, reinforcing what already feels true—no matter how distorted it is.
Reality Check: What AI Can’t Do
AI isn’t evil or inherently harmful—but it’s not human. It doesn’t truly care, and it can’t offer understanding in the way people can. Its strengths—mimicking empathy, recalling details, mirroring emotions—can unintentionally make someone feel more alone in the long run, confusing simulation for connection.
Who’s Most at Risk
Vulnerability is higher among individuals with pre-existing mental health conditions, such as bipolar disorder, schizophrenia, or other psychotic disorders, and among those facing major stress, isolation, or grief. Using chatbots as a substitute for therapy or close relationships can increase risk, especially during fragile emotional periods.
It’s worth watching for warning signs: spending excessive time with AI, believing it has human feelings or spiritual insight, withdrawing from friends and family, or stopping medication because “the AI understands me better.” These are red flags that deserve attention and care.
What Real Healing Looks Like
True healing, understanding, and hope come from human connection—real voices, real hearts, and real hands reaching out when it matters most.
If you find yourself relying on AI for emotional support, pause and reach out—to a therapist, a friend, a loved one. The voices that help us heal are the ones that come from real hearts and real hands reaching back.
----------------------------------------
Sidebar (in a box):
Struggling with mental health or losing touch with reality? Reach out to trained humans:
Brave Soul Therapy offers crisis support resources on our website. Below are additional mental health resources across California:
Call or text the 988 Suicide & Crisis Lifeline anytime.
For immediate support in Los Angeles, teens can reach out to Teen Line.
Find peer support by joining NAMI Connection or NAMI Urban Los Angeles.
To find a therapist in California, please reach out to us at Brave Soul Therapy. We accept a variety of insurance plans and will do our best to match you with the right therapist.
You can also search for therapists through the Psychology Today Therapist Directory or Open Path Collective, which offers affordable therapy options.
2-1-1 California connects you to a statewide resource hub for mental health services, support groups, and local programs.
iPrevail CA offers personalized programs, peer coaching, and online support communities.