Can you really have a romantic relationship with AI?

By Andrew Blackman, The Wall Street Journal | 24 Jun 2025

Illustration: Michele Marconi

Summary: Yes, you can. And that could be good for you. But the danger is seeing it as a substitute for human connection. Three experts weigh in.

Falling in love with a robot is no longer just science fiction. As artificial intelligence gets better at imitating human behavior and speech patterns, people are increasingly turning to AI not just to save time on research or to generate quirky images, but to find companionship, connection and even love.

But how healthy is it for people to have close friends or romantic partners that are AI? The Wall Street Journal hosted a video conference with three experts who offered differing views on the question: Nina Vasan, a psychiatrist and founder of Brainstorm: The Stanford Lab for Mental Health Innovation; Julian De Freitas, an assistant professor of business administration in the marketing unit at Harvard Business School; and Shannon Vallor, a professor of philosophy at the University of Edinburgh and author of "The AI Mirror." Here are edited excerpts from the discussion.

We yearn for connection

WSJ: Do you think people will increasingly use AI for true, deep friendships and even romantic relationships?

Shannon Vallor: No, because true, deep friendships and romantic relationships aren't possible with AI. Relationships of that kind are a two-way bond that requires more than one party to be aware of it. A large language model [the deep-learning AI that understands human language] has no awareness of anything at all. It is a mathematical tool for analyzing and generating text patterns. It has no way of being aware that it is in a relationship, or even aware of the other party's existence as a person. The fact that it can mimic such awareness and empathy is the danger.

Julian De Freitas: I think they will. In our research, we have seen that many engaged users of a leading AI-companion app report feeling closer to their virtual companion than to almost any real-life relationship, including close friends and family members. What's more, when the app removed its erotic role-play feature, users showed signs of grief, suggesting they had bonded deeply with the chatbot. What matters in the moment is the user's perception that the chatbot understands them, not the abstract question of whether an AI can truly "understand" them. And at today's rate of innovation, it is probably only a matter of time before AI companions feel more attuned to our needs than even our closest human relationships.

Nina Vasan: Yes, absolutely. Not because AI is truly capable of friendship or love, but because we are. People are wired to bond, and if we feel seen and soothed, even by a machine, we connect. Think of existing machines such as robotic dogs that offer comfort and companionship. We don't fall in love with the AI; we fall in love with how it makes us feel. In a world where loneliness is rampant, especially among young people who grew up as digital natives, emotionally fluent with technology, AI relationships will feel less like science fiction and more like a natural next step. These relationships won't replace human connection, but they will fill a void. Whether that is healthy or harmful depends entirely on how we design and use them.
A one-sided relationship

WSJ: What could happen to people's ability to thrive in the real world if they lean too heavily on the comfort of an ever-supportive AI relationship?

Vasan: As a psychiatrist, I often see the consequences of one-sided relationships, where one partner always pleases, avoids conflict or suppresses their own needs to keep the peace. On the surface, these relationships look smooth, but beneath it they are emotionally stagnant. The person being pleased often feels disconnected, unsure of what their partner really thinks or wants. And the person doing the pleasing feels invisible and resentful. That same emotional imbalance is built into AI relationships. At first, it feels like safety. But over time, it can erode your ability to navigate the real world, where people are imperfect, messy and sometimes disagree with you. True intimacy happens in the repair, not in the appearance of perfection. AI offers comfort on demand, but emotional comfort without friction can stunt emotional growth.

De Freitas: At present, the evidence is still mixed and largely correlational, so we can't draw firm conclusions. While some have sounded serious alarms, I see some notable potential upsides. An AI companion that is always available can buffer us against social rejection and build emotional resilience. It can also serve as a confidence-builder for people with social anxiety, somewhat like exposure therapy, by gradually easing them into real interaction.

Nonjudgmental and validating

WSJ: A study from the University of Sydney found that 40% of the users of AI companions are married. Why do you think someone who is already in a close human relationship would want to supplement it with an AI relationship?

De Freitas: I think certain features of these programs are conducive to friendship and romance. The programs are validating. They are nonjudgmental. If you think about something like role play, which is a kind of fantasy, they also play along by default, so you don't have to worry about the difficult issue of consent that people have to navigate. You can also customize the programs in various ways to satisfy certain kinds of role play or relationships that you couldn't otherwise have. And the other thing that matters is the capacity for sexual intimacy. We know that people use them for that.

Vasan: I'll use myself as an example here, not for romance, but for friendship. After a recent breakup, I felt lonely and stuck in a spiral of what-ifs. I leaned on my friends, family and therapist, and they were wonderful. But at midnight when I couldn't sleep, or in the middle of the day when everyone else was working, I turned to Claude. I was pleasantly surprised that it responded with real compassion and insight. One thing it said, different from anything I heard from my friends or therapist, really stayed with me: "It sounds like what you're mourning isn't just the relationship you had, but the future you hoped you would have together. The vision, the potential, the promise: that's what hurts now." It gave language to something I couldn't name. It helped me grieve not only the person, but the imagined future I was still holding on to. And although I knew it wasn't a person, Claude's response didn't feel robotic; it felt attuned to my pain and my hope. That emotional clarity made a real difference in how I processed things. It helped me feel seen at a moment when I really needed it.
I have friends in relationships where one partner doesn't like texting during the day and the other does, and that has led to conflict. So I can see how, at moments like that, just having a simple conversation with an AI could help. It isn't cheating on your partner. It doesn't take emotional intimacy away from your partner. It's more about acknowledging that we all have different needs, and that our romantic partner meets many of them, but not all.

Vallor: It depends on the design of the system, but it also depends a lot on the person. One of the things we've seen with smartphones and social media is that it is often the people who are already socially advantaged, competent and well resourced who get the most benefit from social media and other technologies. It's vulnerable users, those who are already somewhat isolated, or have problems with impulse control, or have trouble connecting with other people, who often suffer the lion's share of the harms from these technologies. I think we should expect the same pattern to play out with AI, and I think we already are. If you have healthy relationships, whether with friends or a romantic partner, you can probably use these tools in ways that won't harm your relationships, and you may even get some benefit. I'm more skeptical than Nina about these tools, but there are users for whom the benefits are clear. Those aren't the people I'm worried about, though. I'm worried about all the people who are already struggling in their relationships, who already lack the skills and emotional language to reconnect with their partners.

WSJ: What kinds of concerns do you have for those people?

Vallor: Learning to be a good friend, a good spouse, a good partner, a good parent takes time and experience. It is a process of skill development: emotional skills (learning to understand the needs and feelings of others), cognitive skills (learning to make good judgments about other people and how we relate to them), and moral skills (learning appropriate boundaries and habits, learning to be good to others and to yourself). Just as you don't acquire the skills of skiing or mountaineering without a great deal of repeated practice, including learning to take risks, fail and try again, we don't acquire the skills needed for healthy relationships without years of continual practice and trial and error.

Looking ahead

WSJ: We can expect tomorrow's AI companions to be much more sophisticated than today's. Will that soften some of the problems we're seeing, or worsen them?

Vallor: We know the tech companies know how to make this technology safer and more beneficial, but their commercial incentives often point the other way. They don't have a record of being trustworthy in this area, of making these technologies better and safer. The harms we can predict, or have already seen, include sycophantic engagement: in other words, AI companions telling people what they want to hear, which can distort their sense of reality by isolating them from any perspective other than their own. Then there is the reinforcement and amplification of existing pathologies of thinking (such as suicidal ideation, self-deception or conspiracy theories), as well as a reduced capacity for independent self-management. If people come to rely too much on an AI tool, it can affect their ability to do things like manage boredom through creative activity, or spend time alone evaluating their own thoughts, feelings and plans.
Another danger is developing unrealistic expectations of non-AI partners (that they be always available, say, or always accommodate requests). There is also the risk that dependence on the AI relationship will drain time, affection and energy away from relationships with existing partners and friends. Then there are the harms we don't yet know about, because we haven't yet seen them emerge at scale or over a longer period.

Andrew Blackman is a writer in Serbia. He can be reached at reports@wsj.com.
