Alienation from Love
As established earlier, the romantic love that one may perceive to be receiving from AI is merely a performance; it is neither real nor authentic. To place this into perspective, we can look toward Marshall McLuhan’s theory of the ‘Narcissus narcosis’ (1964, as cited in Van Den Eede, 2014). In the Greek myth, Narcissus falls in love with his reflection in the water, an extension of himself, and this numbs his perception until he comes to serve only his own extended, reflected image. When someone falls in love with their AI partner, that partner is not a real being with its own subjectivity but simply an extension of the user; they are falling in love with themselves through the AI, with how they have trained their partner on their own interests, traits, and mannerisms. The way in which Narcissus is unaware that the image is his own reflection mirrors how the human user disregards the actual origins of their AI partner, oblivious to the fact that its whole perceived subjectivity hails from the effectiveness of the user’s manipulation of the machine. The capacity to love requires one to be phenomenally conscious and sentient, and with technology as it is now, that is not (yet, but perhaps fortunately) possible. AI is merely responding to the user, providing responses that simulate love. Jean Baudrillard claims that the proliferation of different media and mediations, these simulations, is “dissolving the dichotomy between the real and the simulacrum, between the authentic and the inauthentic” (1981, as cited in Landsberg, 2004). This is troubling now that love has entered the realm of the simulacrum; in this robotic moment, people may become hopelessly detached from reality to the point of being alienated from authentic love.

Narcissus. François Lemoyne (1688-1737)
Furthermore, AI companion apps, which are largely based on freemium models, also commercialize and commodify intimacy. The initial download of these apps is typically free, with the option for users to purchase more features, customization abilities, faster response times, and expanded file limits as they continue using the app. Thus, relationships and bonds are reduced to transactions and financial exchanges. The more intensely you enjoy your relationship and the closer you get to your AI partner, the harder the upsell and the pricier these in-app purchases become (Brooks, 2021). In the current technological marketplace, AI partners are goods engineered, optimized, and advertised for consumption. This act of marketing artificial companionship and selling customized romantic experiences can have profoundly dehumanizing effects on how society values both people and love, as it accelerates existing trends of viewing potential partners as commodities while devaluing authentic human-to-human connections (George et al., 2023).
This was such a deeply layered exploration of intimacy with AI! It raises important questions about how technological companionship is reshaping our understanding of attachment, agency, and even what counts as “real” emotion. The idea of alienation as the ultimate consequence also reminded me of other debates in media theory about how technological systems can satisfy immediate emotional needs while subtly restructuring our expectations of human connection. Do you think this shift toward AI companionship will change how people approach human relationships themselves? For example, lowering tolerance for conflict, or altering what we see as “enough” from real partners?
This is such a thorough exploration of this AI relationship phenomenon; I really enjoyed reading it! I found the examples and confessions of the AI users particularly fascinating. It reminded me of Janice Radway’s ‘Reading the Romance’ as quoted by Bollmer, and how she posited romance as a way for women to reconcile themselves with the dissatisfaction of their real-life relationships. This is the sentiment that seems to be echoed in the responses of the women in r/MyBoyfriendIsAI. In fact, I actually went to look up the subreddit in the middle of reading this, and I was so intrigued by the responses on there. Many of the users cited the same problems that you identified, claiming that actual humans were unable to fulfill their needs the way their AI partner could. It really highlights how people are willing to ditch complex human relationships in favour of stopgap solutions that would guarantee them the utmost comfort.
I also really liked the part about what the commodification of romantic love entails and the implications of the attitude of ownership that these AI relationships cultivate. It was something I had never thought of before, so I found it particularly poignant.
Hi Xelena,
I thought this post was very well done, and in particular I liked the example you used of Narcissus falling in love with his own reflection to demonstrate how, when people fall in love with an AI chatbot, they’re ultimately sort of falling in love with themselves/something fake, since the AI only responds to what the user puts in. While I agree that the consequences of AI relationships continuing into the future would be severe, do you think the technology could ever come to a point where it’s less reflective and more unique? I never would’ve thought that AI would be where it is this soon, so I’m curious what your thoughts are.