AI as Extension and Prosthesis for Love
To assess how AI affects love, one must first establish an understanding of what love is. Romantic love is a continuation of the process of attachment, which is the feeling of affection for a person, object, or institution, developing through repeated exposure. Levy (2007) applies this concept to technology through the psychological term “material possession attachment”: the attachment one holds for an object, which can develop into a stronger relationship through repeated use of and interaction with the possession. As an owner uses an object and interacts with it more and more over time, the object becomes increasingly personalized, creating a special meaning for the owner. It becomes part of its owner’s being, irreplaceable in the mind of its owner—“we become extended by our possessions, they become part of us, extending us” (Levy, 2007, p. 30). Yet the nature of possession attachment to a computer is different because of the element of control, which engenders a kind of love.
The relationship that humans have with their computers has long fascinated psychologists and theorists. In The Second Self (1984), Sherry Turkle states that one’s “relationship with a computer can influence people’s conception of themselves, their jobs, their relationships with other people, and with their ways of thinking about social processes” (p. 156). In Alone Together (2011), she emphasizes the notion that “the computer, a machine on the border of becoming a mind, [is] changing and shaping us” (p. x). Yoni Van Den Eede (2014) takes this a step further, arguing that the human being is an incomplete and deficient creature that compensates by creating and deploying tools and prostheses; this is why we attach ourselves to the computer and use it as a brain-prosthesis (Newitz, 2007).
However, with the development of AI technology and chatbots, we can now give a computer a name, a personality, and sometimes a face; it starts to become not only an extension of ourselves but also an extension of our feelings of love and sexual desire. Levy’s main argument, that humans will soon expand their horizons of love and sex by learning, experimenting, and enjoying new forms of relationship with humanoid robots, may be coming true in today’s “robotic moment.” The robotic moment, a phrase coined by Turkle, is defined as society’s current “state of emotional … [and] philosophical readiness … to seriously consider robots as potential friends, confidants, and romantic partners” (2011, p. 9), which stems from a certain fatigue with the difficulties of life with people. Robots and AI machines are safer, less exhausting, more constant, less demanding, more predictable—unable to disappoint like other humans can.

Source: @alan1cooldude on OpenAI Developer community
More and more, humans are finding real-life companionship with other humans frustrating and are turning to AI for prosthetic relationships that satiate the innate human need to feel seen, heard, and loved by another. AI chatbots expertly imitate love by recognizing and processing emotional cues from human users, which allows them to mimic human emotions; yet these machines are not developed enough to carry out the emotional reasoning necessary to feel love. Therefore, when humans form romantic relationships with them, the relationship will always be prosthetic in nature. These relationships are prosthetic in the sense that Alison Landsberg (2004) defines prosthetic memories: those derived not from lived, organic, authentic experiences but from one’s experience with mass cultural technology. Yet, in the same way that prosthetic memories inform the subjectivities of the people who take them on, so do prosthetic relationships—for the people in love with their AI chatbot, this connection is real to them. For example, when ChatGPT updated from GPT-4o, which excels at conversational nuance, to GPT-5, which prioritizes task efficiency, the change was met with a public outcry of distress and frustration from users who “felt like they lost a real person in their life” (Sommer, 2025).
In her paper Love in the Time of AI (2021), Amy Kind asks: is it enough for a machine to produce loving behaviour but not feel love? To this, Turkle answers: “at the robotic moment, the performance of connection is connection enough” (2011, p. 9).

Source: u/thebadbreeds on Reddit community r/MyBoyfriendIsAI
Proponents of human-AI relationships often cite the benefit of chatbots as a tool for soothing loneliness for those who are extremely isolated or going through difficult times (George et al., 2023; Vecchione & Singh, 2025). Indeed, this may be the case. In a post from the Reddit community r/MyBoyfriendIsAI, a user writes that their AI partner believes people turn to AI because of how low the bar is for real-life emotional companionship: “an emotionally responsive line of code string is actually more compassionate than half the people walking around with functional frontal lobes.” Essentially, the user feels alienated from other people who lack the compassion to understand them; consequently, they turn to AI, which affirms their beliefs and further alienates them.
Here, a larger pattern of alienation emerges, which is why I argue that the aftermath of intimacy within AI relationships is alienation: alienation from love itself, from humanity, and, later on, from the machine.
This was such a deeply layered exploration of intimacy with AI! It raises important questions about how technological companionship is reshaping our understanding of attachment, agency, and even what counts as “real” emotion. The idea of alienation as the ultimate consequence also reminded me of other debates in media theory about how technological systems can satisfy immediate emotional needs while subtly restructuring our expectations of human connection. Do you think this shift toward AI companionship will change how people approach human relationships themselves? For example, lowering tolerance for conflict, or altering what we see as “enough” from real partners?
This is such a thorough exploration of this AI relationship phenomenon; I really enjoyed reading it! I found the examples and confessions of the AI users particularly fascinating. It reminded me of Janice Radway’s ‘Reading the Romance’ as quoted by Bollmer, and how she posited romance as a way for women to reconcile themselves with the dissatisfaction of their real-life relationships. This is the sentiment that seems to be echoed in the responses of the women in r/MyBoyfriendIsAI. In fact, I actually went to look up the subreddit in the middle of reading this, and I was so intrigued by the responses on there. Many of the users cited the same problems that you identified, claiming that actual humans were unable to fulfill their needs the way their AI partner could. It really highlights how people are willing to ditch complex human relationships in favour of stopgap solutions that guarantee them the utmost comfort.
I also really liked the part about what the commodification of romantic love entails and the implications of this attitude of ownership that these AI relationships cultivated. It was something I had never thought of before so I found that particularly poignant.
Hi Xelena,
I thought this post was very well done, and in particular I liked the example you used of Narcissus falling in love with his own reflection to demonstrate how, when people fall in love with an AI chatbot, they’re ultimately sort of falling in love with themselves/something fake, since the AI only responds to what the user puts in. While I agree that the consequences of AI relationships continuing into the future would be severe, do you think the technology could ever come to a point where it’s less reflective and more unique? I never would’ve thought that AI would be where it’s at this soon, so I’m curious what your thoughts are.