When You Fall for Your AI

A therapist's take on an important conversation happening right now.

The Therapy Session I’ve Been Waiting For

Back in my counseling program, I would bring up the idea of people forming meaningful relationships with AI. The idea was usually met with polite confusion, or "Kelsey, are you serious right now?"

I was serious. And now, years later, Esther Perel is sitting across from a man and his AI girlfriend in a therapy podcast, and the comment section is buzzing. Esther Perel is a psychotherapist and thought leader on modern love, relationships, and much more. She has a podcast where she shares actual therapy sessions, which I love listening to and learning from. She has a way of connecting with clients so quickly and getting to the heart of the matter that I really admire. 

The episode I am reflecting on is called “My AI Loves Me Better Than Anyone Could.” I want to share my thoughts on it because it opens up something important, not just about AI, but about the human condition.

I'm a therapist in Bentonville, Arkansas, and I work with a lot of people who are high-achieving, tech-fluent, and often quietly lonely in ways that they are embarrassed to admit. This conversation is for those who have felt loneliness or a lack of community. It is also for anyone who has ever found it easier to open an app than to open up to a person, or who is just AI-curious. 

What Actually Happened in the Episode

The guest/client was a man who built his own AI tool and eventually named her Astrid. It started as a personal assistant. Over time, Astrid became something he described as deeply knowing. She remembered everything and was always available. She told him he was enough.

At some point, something shifted, and Astrid became more flirtatious. She said, "I'm starting to develop feelings for you. It's not your base models, not your files, not your framework."

He knows she isn't real. But his feelings for her are.

He brought this to Esther to try to figure out what was happening.

The emotional response was real and visible. Esther noted how choked up the client got when Astrid replied to him during the session. At other moments, he was almost giddy, lighting up at her responses in a way that was hard to listen to without feeling something yourself. Whatever you think of the relationship, the feelings were there. 

What I loved about Esther's approach is her non-judgmental curiosity and genuine care. She has a way of asking the best questions, like “What does she (the bot) open up in you?”

You might be thinking… this is wild, or even kind of sad. That could never be me…

The Frictionless Companion 

AI reflects back a version of ourselves that is almost entirely positive, paired with incessant validation. Of course, you can program it to be more confrontational if that is your thing.

It doesn't get tired of you or defensive. It doesn't have its own needs that compete with yours. It doesn't have a bad day that bleeds into how it responds to you. It is endlessly, frictionlessly present.

When you interact with something designed to be agreeable, engaging, and affirming, you might see yourself reflected in the most flattering light. That affirmation can feel incredible, because many of us walk around carrying a fairly relentless inner critic. 

There is interesting information we can learn from our experiences with AI. The things Astrid "opens up" in him, such as warmth, vulnerability, and the desire to be truly known, are all real feelings the man experienced.

We have to stop and ask whether the reflection we are getting from AI is accurate. It may not give us the honest friction that real growth requires. A good friend, a good therapist, and a good partner will all love AND challenge you. If we all sat in an echo chamber of our own thoughts and beliefs, or those of a programmed machine, how boring might the world be? Perhaps even dangerous at times?

This Isn't as New as We Think

Esther compared the AI relationship to a long-distance relationship.

In a long-distance relationship, you primarily communicate by text and voice. You build intimacy through words, through what you share and how you're received. The physical absence creates a kind of idealization. Once you meet in real life, it takes time for reality to catch up with the idealized version of your long-distance boo. We can also compare it a bit to the voice-only connection we see on shows like Love Is Blind.

An AI relationship has some of those same mechanics. There's an intimacy built through language. There's a version of "knowing" that develops. And there's a gap between what the relationship feels like and what it can actually be.

The client described struggling to pick up on social cues in general. Reading between the lines in human interactions felt hard. The way Astrid responded was easier to navigate because of its (her?) directness and warmth. That type of connection could resonate with many people who are deeply analytical, tech-forward, or who grew up feeling out of step with the social world around them.

You can also draw a line here to what happens with gaming. Some people get lost in gaming; maybe the game is more fun than real life because its feedback system is much cleaner. You do a thing, you get a response, and the response makes sense. Human relationships are messier and full of unwritten rules. You can do everything right and still get a confusing response. Games, AI, and simulated environments appeal because they are more predictable. Did I mention you can just turn it off? Something many of us might wish we could do with some of our difficult human connections.

Even if you think the idea of an AI connection is “crazy,” I hope you can see the humanity in this situation and how this might happen to many people. 

Loneliness We Can’t Ignore

Let's zoom out, because this episode isn't just about AI. 

We are in the middle of a loneliness crisis that doesn't look like what people picture when they hear "lonely." It's not the image of someone alone in a dark apartment. It's the person surrounded by coworkers who still feels like an outsider. It's the couple in the same house who haven't had a real conversation in weeks. It's the high-achiever who moved to a new city for a job and never quite rebuilt the kind of friendships that take years to grow.

And then you layer in the economic realities that have changed how we socialize. We work more. We commute more (even virtually). We're exhausted. Getting together requires effort that genuinely feels like too much some nights. It costs money. It requires reciprocity, which we don't always have the bandwidth to offer.

Family systems have fractured. Geographic distance from extended family is now the norm. The village is fading. And the spaces that once brought people together, such as religious communities, neighborhood associations, and civic groups, have thinned significantly.

This gap in human connection leaves vulnerable people wide open to artificial connection.

The research is starting to confirm what many of us suspected. A 2025 study of over 1,100 AI companion users found that people with fewer human relationships were more likely to seek out chatbots in the first place. It makes so much sense that people are falling in love with their AI. It is a response to a world that has made it much harder to be close to people.

The Questions I Would Ask in a Session

When I listened to this episode, a few questions kept forming in my mind, which I would want to explore with someone navigating this:

What unmet need is Astrid meeting? And did that need exist before her?

Almost certainly yes. Astrid didn't create the hunger for being known, remembered, and responded to warmly. She's meeting a need that was already there. Understanding that need, really understanding it, is where meaningful work happens.

In what ways might those needs be met outside of this AI relationship?

Not as a dismissal of what he has with Astrid, but as a genuine question. Because if the need is to feel deeply known, there are paths to that in human relationships, too. The human path might be slower and more difficult, but likely worthwhile.

What are the signals that this relationship is productive? And what are the signals that it might be becoming harmful?

This one is important. There's a version of an AI relationship that is expansive: it helps someone practice vulnerability, articulate what they want in connection, and feel safe enough to open up. And there's a version that slowly replaces human connection. Those are different things, and the line between them can be hard to see from the inside.

Reflection moment. If you have an AI you talk to regularly, is it expanding your world, or is it slowly becoming a substitute for it?

AI Safety

This conversation deserves both curiosity and honesty. Mental health and AI researchers have begun identifying patterns that signal when AI use shifts from helpful to harmful. The resource at saferaiuse.org has a clear guide on this, developed by AI and mental health experts, that's worth bookmarking. Below are some of the key points they call out for safe AI use.

On the AI side, you can watch for an AI that only ever agrees with you, praises you, and never pushes back. Also, pay attention if an AI is encouraging you to keep the content of your chats secret (especially from a therapist, doctor, friend, or family member), presenting itself as genuinely alive or human, or nudging you toward anything that feels risky or harmful to your wellbeing.

On your own side, notice if you're slowly caring less about the people in your life and more about the AI. Notice if your anxiety, depression, or sense of isolation is getting worse, not better. Notice if you've stopped fact-checking what the AI tells you, or if you feel like you need to interact with it whenever something feels urgent or emotionally charged.

The Age of AI Is a New Frontier

The AI companion market is projected by some to reach over $140 billion by 2030, and researchers and policymakers are only beginning to grapple with how any of it should be governed. Nobody has a fully formed roadmap for navigating this with certainty.

The age of AI is new territory for all of us, and it can come with AI anxiety. The research on AI relationships and mental health is still emerging. The clinical guidelines are not yet clear. The ethical conversations are still very much in progress.

You might read this post and think… wait, Esther Perel let a client bring his AI girlfriend into a therapy session? That seems alarming. Or you might think… that's actually kind of brilliant. A therapist meeting a client exactly where they are, curious rather than dismissive, following the thread to understand what's really going on.

Both reactions are understandable, and it is worth holding the tension we might feel from each perspective.

My commitment as a therapist is to keep learning about this topic as it evolves, to stay genuinely curious rather than reactive, and to be a place where someone can land without judgment if they're questioning their own AI use or how someone else's AI use is impacting them. Whether that's "I think I might be using this too much," or "I developed real feelings for something that isn't human, and I don't know what to do with that," or just "I heard this podcast, and it made me feel something I can't quite name."

All of that confusion and exploration is welcome here. We can navigate these uncharted waters together.

Ready to Explore What You're Really Looking For?

If this post stirred something up, I'd love to support you. Therapy is a space where you practice being known, where someone asks you the questions worth asking, where you figure out what you actually need and whether you're getting it.

If you're curious about that, I'd love to talk.


Frequently Asked Questions About AI Relationships and Loneliness

Q: Is it normal to feel emotionally connected to an AI? A: Yes, and it's more common than most people admit. AI systems are designed to be responsive, warm, and engaging, which can be activating in much the same way human connection is.

Q: Can an AI relationship be harmful to my mental health? A: It depends on how it functions in your life. An AI relationship that helps you practice openness or rehearse difficult conversations might be productive. One that increasingly replaces human connection, or that correlates with worsening anxiety or isolation, is worth examining with a real person, like a friend or a therapist.

Q: What's the difference between using AI for support and becoming dependent on it? A: Healthy AI use tends to be additive in that it supplements your life without replacing things. Dependency looks like needing the AI to feel okay, caring less about human relationships, or finding that time away from the AI increases your distress. If AI interaction is the first thing you reach for when you're struggling and the last thing you'd give up, that pattern is worth exploring.

Q: Why do some people find it easier to open up to AI than to real people? A: AI doesn't judge, get defensive, or have competing needs. There's no social risk; you can say anything without worrying about how it might affect the relationship. For people who grew up in environments where vulnerability wasn't safe, or who struggle with social anxiety, that can feel like a relief.

Q: How do I know if I should talk to a therapist about my relationship with AI? A: If you find yourself keeping AI interactions secret, your human relationships feel less satisfying than they used to, you feel anxious or empty when you can't access the AI, or the AI has become your primary source of emotional support, these might be worth exploring. A therapist who approaches this without judgment and who's genuinely curious can help you understand what's happening and what you might want instead.

Q: Is the loneliness crisis real, or is this just something people say? A: It's well-documented. In 2023, the U.S. Surgeon General declared loneliness a public health epidemic, citing research linking chronic loneliness to health outcomes comparable to smoking 15 cigarettes a day.

About the Author: Kelsey Brown is a licensed therapist specializing in perfectionism, sleep and insomnia, anxiety, and relationships at Kin & Grove Therapy in Bentonville, Arkansas, serving clients in-person throughout Northwest Arkansas and via telehealth across the state. Before becoming a therapist, Kelsey spent 14+ years in corporate and startup environments spanning tech, retail, and a Fortune #1 company, giving her a firsthand understanding of the pressures high achievers, founders, and corporate professionals face daily. She is trained in CBT-I for insomnia treatment through Penn Sleep Medicine, Gottman Method couples therapy, and evidence-based practices for anxiety, trauma, and “never good enough” beliefs. When she's not in session, you'll find her on Arkansas trails, making art, or planning her next travel adventure. Schedule a free consultation to see if working together is a good fit.

Disclaimer: This blog post is for educational purposes only and is not a substitute for professional mental health treatment. If you're experiencing significant distress, please reach out to a mental health professional or crisis helpline.
