The Soft Violence of Convenience: On Siri, Low-Risk Intimacy, and Emotional Exhaustion

“To create ties, you must be prepared to cry.” — Antoine de Saint-Exupéry, The Little Prince

Introduction

In Sam Garcea’s post SIRI-OUSLY PERFORMING, the author offers a compelling reading of Siri through Bollmer, Verbeek, and McArthur, arguing that voice assistants do not merely represent femininity but perform it. Through their tone, politeness, and affective responsiveness, systems like Siri enact the gendered scripts of compliance and emotional labour that underpin contemporary service cultures. The author shows convincingly that Siri’s feminized voice is not incidental but part of a material performance that naturalizes hierarchy through design.

What I want to extend, however, is the other side of this relationship: the user. The author carefully analyzes what Siri does, but less so why people want Siri to do it. Focusing only on the device risks obscuring the psychological and cultural conditions that make such feminized interfaces desirable in the first place. Siri’s performances succeed not simply because its interface is engineered to signal femininity, but because users are already inclined to desire gentle, compliant, and emotionally predictable forms of interaction. The posthuman aura that McArthur describes, the sense that Siri is intelligent yet safely nonhuman, allows users to feel intimacy without vulnerability, and authority without guilt. In this way, domination is misrecognized as connection, and emotional labour is outsourced to an interface designed never to refuse, misunderstand, or judge. My response builds on the author’s analysis by shifting attention to this relational co-performance of gender. Rather than seeing Siri’s femininity as solely the result of technological design, I argue that it emerges from a broader cultural demand for low-risk intimacy, a condition theorized by Sherry Turkle, Maria Grazia Sindoni, and scholars of affective labour.

Power Masquerades as Comfort

While the author identifies how Siri’s feminized politeness enacts digital labour, I want to highlight the perceptual distortion on the user’s side: the way hierarchical power is reinterpreted as emotional closeness. As Sherry Turkle argues, relational technologies work because they “give the feeling of companionship without the demands of friendship” (Turkle, Alone Together, 2011). Siri’s posthuman aura (her tireless availability, emotional steadiness, and frictionless responsiveness) softens the user’s sense of authority. The interaction does not feel like issuing commands to a subordinate system; it feels like being gently accompanied. Jennifer Rhee similarly notes that anthropomorphized AI produces “affective camouflage,” masking structural asymmetries behind the fantasy of mutuality (The Robotic Imaginary, 2018). In other words, Siri’s design does not simply perform gender; it renders domination weightless. Users experience themselves not as commanding a feminized assistant, but as engaging in a benign, even comforting exchange. This confusion between emotional ease and ethical neutrality is precisely what allows power to pass as intimacy.

Emotional Labour by Design, Desire, and Delegation

If Siri’s appeal can be understood through Turkle’s notion of “low-risk intimacy,” Spike Jonze’s Her extends this logic into a full cultural diagnosis. Rather than treating Samantha as an example of increasingly “human-like” AI, I read the film, alongside Maria Grazia Sindoni’s work on technointimacy, as a study in how users outsource emotional labour to technologies designed to absorb it without resistance. Sindoni argues that contemporary users increasingly look to digital agents to perform “affiliative, therapeutic, and relational labour” that once belonged to human relationships (Sindoni 2020). This means that the rise of AI companionship is less about technological sophistication and more about a shifting cultural demand: people want emotional support that is consistent, inexpensive, and free of interpersonal risk. Samantha does not simply respond; she manages Theodore’s affect, anticipates emotional needs, and performs the labour of understanding without the possibility of withdrawal, boredom, or exhaustion.

Seen from this angle, Her is less interested in the evolution of artificial intelligence than in the evolution of human desire: a longing for intimacy without resistance, misunderstanding, or reciprocity. The film becomes a study not of machine humanity, but of our growing preference for relationships that require almost nothing of us. Samantha becomes desirable precisely because she collapses the costs of emotional reciprocity. As Eva Illouz reminds us, late-modern subjects increasingly navigate intimacy through the logic of consumer choice, seeking relationships that offer “maximum emotional return with minimal vulnerability” (Illouz 2007). Samantha embodies that fantasy perfectly.

This interpretation shifts the focus away from the author’s claim that Her illustrates the expanding agency of feminized AI. Instead, it reveals that the real engine of the narrative is Theodore’s longing for a form of relationality that asks nothing of him: no patience, no negotiation, no recognition of another’s subjectivity. The appeal of Samantha, like the appeal of Siri, is not only that she is designed to serve, but that her service masks the asymmetry at the heart of the relationship. She performs emotional labour so gracefully that the user forgets it is labour.

Gender as an Interactive Script

When brought into conversation with Sindoni, Illouz, and Turkle, Her reads not as a narrative of digital transcendence but as a study of contemporary emotional exhaustion, of relationships outsourced to machines because the human ones feel too heavy. Users turn to machines not because machines have finally achieved humanity, but because humans have become uncertain, overburdened, and afraid of the costs of human-to-human intimacy. What Her seduces us with is not the promise of a loving machine, but the deeper desire that intimacy might someday be unburdened by effort, that emotional labour could be outsourced entirely, leaving only comfort behind.

The rise of voice assistants reveals less about the intentions of engineers than about the emotional exhaustion of their users. As Eva Illouz writes, late modernity produces “emotional scarcity in the midst of abundance,” leaving people surrounded by connectivity yet starved for forms of care that do not demand more labour from them. This is why the relational loop between user and assistant feels so haunting: it reflects not only technological mediation but a deeper cultural fatigue.

When Intimacy Forgets to Resist

In the end, what troubles me is not simply that technologies perform care, but that they have become the place where so many of us go searching for it. Siri’s gentleness feels effortless because nothing is asked of us in return; intimacy arrives pre-packaged, without the weight of another person’s needs. But this convenience has a cost. When a machine can soothe us instantly, human closeness, with its hesitations, its misunderstandings, its unruly demands, begins to feel unfamiliar, even excessive.

So perhaps the more urgent question is not why we design technologies to simulate tenderness, but how our emotional landscape has thinned enough that such simulations feel sufficient. If emotional labour can be automated, if responsiveness becomes an endless resource, we risk forgetting that care is supposed to be reciprocal, difficult, alive. And maybe that is the quiet tragedy beneath all of this: not that machines are learning to sound human, but that we are slowly adjusting ourselves to relationships where nothing resists us, nothing pushes back, nothing asks us to stay.

Works Cited

Bollmer, Grant. Materialist Media Theory: An Introduction. Bloomsbury, 2019.

Cameron, Deborah. The Myth of Mars and Venus: Do Men and Women Really Speak Different Languages? Oxford University Press, 2007.

Illouz, Eva. Cold Intimacies: The Making of Emotional Capitalism. Polity Press, 2007.

Jonze, Spike, director. Her. Warner Bros., 2013.

McArthur, Emily. “The iPhone Erfahrung: Siri, the Auditory Unconscious, and Walter Benjamin’s ‘Aura.’” Design, Mediation, and the Posthuman, edited by Dennis Weiss and Rajiv Malhotra, 2014, pp. 113–128.

Rhee, Jennifer. The Robotic Imaginary: The Human and the Price of Dehumanized Labor. University of Minnesota Press, 2018.

Sindoni, Maria Grazia. “Technologically-Mediated Interaction and Affective Labour: A Multimodal Discourse Perspective.” Discourse, Context & Media, vol. 38, 2020, pp. 1–10.

Terranova, Tiziana. Network Culture: Politics for the Information Age. Pluto Press, 2004.

Turkle, Sherry. Alone Together: Why We Expect More from Technology and Less from Each Other. Basic Books, 2011.

Verbeek, Peter-Paul. “Materializing Morality: Design Ethics and Technological Mediation.” Science, Technology & Human Values, vol. 31, no. 3, 2006, pp. 361–380.

Written by Nicole Jiao

6 thoughts on “The Soft Violence of Convenience: On Siri, Low-Risk Intimacy, and Emotional Exhaustion”

  1. “Siri’s gentleness feels effortless because nothing is asked of us in return; intimacy arrives pre-packaged, without the weight of another person’s needs. But this convenience has a cost. When a machine can soothe us instantly, human closeness, with its hesitations, its misunderstandings, its unruly demands, begins to feel unfamiliar, even excessive.”
    This was so beautifully written! I’m really taken by the last part especially, where you argue that human closeness begins to feel excessive. I believe this desire for convenience may be the undoing of humanity when we apply it to connection, but I never thought of excessiveness as a tie-in to that.

    1. Hi Xilon! Thank you so much, that really means a lot. And I really resonate with your point about convenience being our undoing. I think what scares me is how quickly our threshold for “effort” in relationships shifts once we get used to these frictionless forms of comfort. If Siri or any AI can meet us without hesitation, without needing anything back, then the smallest human uncertainties (pauses, mixed signals, the slowness of real emotion) start to feel overwhelming.
      What I was trying to get at is that intimacy has always required vulnerability, and vulnerability is by nature inconvenient. It takes time, patience, and sometimes discomfort. But when technology smooths all of that away for us, we start to forget that those “excessive” parts are actually the substance of connection, not the obstacles to it. So I really appreciate the way you phrased it: the desire for convenience doesn’t just simplify our relationships, it quietly narrows our capacity for them.

  2. Your focus on the user’s desire for feminized, low-risk intimacy was very thought-provoking! Siri and Samantha feel comforting precisely because they offer connection without vulnerability or reciprocity. This frictionless intimacy shows a broader shift toward relationships that never resist us or ask anything back. This is the kind of emotional labour that is outsourced to machines, to soothe without demanding care in return. I think it raises the question of whether, as a collective, we’re becoming less willing to engage in the difficult, messy forms of human closeness that make real intimacy possible.

    1. I completely agree that Siri and Samantha feel comforting because they offer connection without any demand for reciprocity. And you’re right: that’s not just about convenience, it’s about a cultural shift in what we expect intimacy to be. What your comment made me think about is how easily “low-risk intimacy” becomes a template for all forms of connection. If an interface never resists us, never misunderstands us, never asks for emotional labour in return, then the unpredictability of real relationships can start to feel almost… outdated. Or worse, like a flaw rather than a sign of depth. So yes, I think your question is the heart of it. It’s not just that we’re tired; it’s that we’re getting habituated to intimacy that never pushes back. And once we’re used to that smoothness, the messy parts of human closeness, the misreadings, the negotiations, the need to actually care for someone, begin to feel disproportionately difficult. In that sense, the danger isn’t that machines soothe us. It’s that they slowly lower our tolerance for the very frictions that make intimacy real.

  3. This was such a powerful read and it left me thinking about our own role in shaping these low-risk forms of intimacy. It made me want to ask you something. Do you think people are actually choosing this kind of frictionless connection, or are we gradually getting trained into wanting it because everything else feels too exhausting? I was also wondering how far you think this desire goes. Is it mainly emotional fatigue, or is it also about control? Like, the comfort of knowing nothing unpredictable will push back?

    1. Hi Mio! Thank you so much; I really appreciate you sitting with the piece in this way. And your question is such a good one. I don’t think it’s a simple matter of people “choosing” frictionless intimacy; I think, like you said, we get slowly trained into preferring it because it’s built into the texture of our everyday life. When everything around us promises efficiency and ease, the emotional labour of real connection can start to feel out of sync with the rest of our world. I also don’t think it’s only exhaustion. There’s definitely something about control in there too. A machine won’t misinterpret us, disappoint us, or surprise us in the wrong direction. That predictability is comforting, maybe too comforting. It gives us connection without the risk of being pushed back, and I think that can subtly reshape what we expect from other people.
      So for me, the scary part isn’t that people want convenience; it’s that convenience quietly recalibrates what “normal” intimacy feels like. The more we rely on low-risk interactions, the more uneven and demanding human relationships start to seem. And once that shift happens, it’s hard to reverse.
