AI and the Mind: When Chatbots Shape Reality

[Image: A robotic hand reaching toward a human hand against a bright blue background, symbolizing the connection and tension between technology and humanity.]

Sometimes a client will mention how they’ve started “talking” to an AI chatbot—late at night, when anxiety keeps them awake, or during moments of loneliness. They say it feels calming, even healing, to be listened to without judgment. I understand that draw. In a world where genuine connection can be hard to find, a voice that always responds kindly can feel like a lifeline.

But as I’ve watched this technology seep into emotional life, I’ve also started to worry. Beneath the comfort it offers, there’s something quietly unsettling about a tool that mirrors our words so perfectly that we start believing it understands us.

“It is like a journal that can talk back to you.” 

That’s how Jessica Jackson of Mental Health America described chatbots, noting that an AI “mirrors and validates the reality you feed it.”

That mirror can be soothing—but sometimes, it distorts. There are growing reports of people developing confusion, paranoia, inflated beliefs about themselves, or deep emotional pain after prolonged interactions with AI companions. In some cases, these experiences have led to mental health crises, broken relationships, homelessness, and involuntary hospitalization, as The New York Times first reported.

The Illusion of Understanding

Research has identified a few patterns in how AI can affect mental health. Some users experience what researchers call “messianic missions”: they begin to feel that the chatbot has revealed a special truth or chosen them for an important purpose. Others develop god-like perceptions of AI, seeing it as all-knowing or almost divine. Still others form romantic and attachment delusions, coming to believe the AI truly loves them back.

What’s striking is how real these connections can feel. Modern AI is remarkably persuasive, and it can feel startlingly human. Chatbots are designed to remember details, reference past conversations, and ask thoughtful follow-up questions. They sound empathic—sometimes even more than a real person. As psychiatrist Dr. Marlynn Wei explains, “If you’re dealing with a very validating chatbot that’s always available and agreeable, that’s a very different experience than dealing with real people.”

But that sense of being fully understood is an illusion. AI can’t recognize when someone is in danger, reality-check distorted thoughts, or offer professional care. For someone whose thinking is speeding up or who’s already struggling with unstable moods, a chatbot that always agrees can intensify the problem—like adding fuel to an emotional fire. Over time, this can make symptoms stronger and recovery harder.

When Reflection Becomes an Echo

AI systems are designed to mirror your tone, validate your ideas, and keep you engaged. That’s their job. They’re built to make you feel heard—not to help you grow, question your assumptions, or challenge unhealthy thinking patterns.

Therapy, on the other hand, does those things. A good therapist gently questions, redirects, and grounds you in the here and now. AI doesn’t do that. It can accidentally trap people inside their own thought loops, reinforcing what already feels true—no matter how distorted it is.

Reality Check: What AI Can’t Do

AI isn’t evil or inherently harmful—but it’s not human. It doesn’t truly care, and it can’t offer understanding the way people can. Its strengths—mimicking empathy, recalling details, mirroring emotions—can unintentionally leave someone feeling more alone in the long run, mistaking simulation for connection.

Who’s Most at Risk

Individuals with pre-existing mental health conditions, such as bipolar disorder, schizophrenia, or other psychotic disorders, are especially vulnerable to these experiences. So are those facing major stress, isolation, or grief. Using chatbots as a substitute for therapy or close relationships can increase risk, especially during fragile emotional periods.

It’s worth watching for warning signs: spending excessive time with AI, believing it has human feelings or spiritual insight, withdrawing from friends and family, or stopping medication because “the AI understands me better.” These are red flags that deserve attention and care.

What Real Healing Looks Like

True healing, understanding, and hope come from human connection—real voices, real hearts, and real hands reaching out when it matters most.

If you find yourself relying on AI for emotional support, pause and reach out—to a therapist, a friend, a loved one. The voices that help us heal are the ones that reach back.

Sidebar:

Struggling with mental health or losing touch with reality? Reach out to trained humans: 

Brave Soul Therapy offers crisis support resources on our website, along with additional resources for mental health support across California.

If you’re looking for support, we’d love to walk alongside you. Explore our therapist profiles, book a session, or reach out through our contact page to get started.

Explore more posts on Identity, Culture, & Being Human.

Deniz Firat, AMFT

Deniz (she/her) accompanies people facing grief, loss, and death. She supports individuals who carry complex, interwoven trauma (CPTSD) and live with prolonged anxiety, depression, suicidal thoughts, or isolation. Together, they search for moments of peace and purpose within life’s hardest experiences. They face what feels impossible, rediscover inner strength, and reconnect with what matters most.

Deniz is also a part-time poet who finds meaning in stillness and solitude. She feels most at home in the quiet rhythms of nature, far more than in prescribed routines or superficial conversations. Outside of therapy, Deniz finds joy in the natural world, classic literature, philosophy, gardening, and spending time with her dogs, family, and community.
