Spoilers for the shows Pantheon and Upload
Throughout Allie Demetrick’s blog post, “Pantheon: Authenticity, Perception, and Embodiment,” she explores how to define digitally uploaded consciousness and whether it can retain human authenticity. In this critical response post, I will compare Allie’s insights on Pantheon with various plot points from the show Upload, ultimately working toward a potential answer to Allie’s question: “if our consciousness is not attached to the material, what still matters?”
The Amazon Prime series Upload, similarly to Pantheon, explores the implications of a digital afterlife in which a human consciousness can be uploaded to a technological interface to extend its existence beyond the body. Allie analyzed Walter Benjamin’s ideas of aura and mechanical reproduction: to upload someone to the digital afterlife, their physical body must be destroyed, reproducing the human while destroying their natural aura. This manipulation of the natural in order to live a simulated life, parallel to reality, is by definition a way of living inauthentically. The conclusion is that these “uploads” are not real; they are artificial experiences, mutable and simulated.
The clearest evidence of the extreme mutability of these digitized consciousnesses is the malleability of time, as described in Allie’s analysis. As George Orwell writes in 1984, “Who controls the past controls the future: who controls the present controls the past.” In Pantheon, Allie’s blog post describes time as flexible: the perception of time can be manipulated, so digital humans can experience a year in a day or a day in a year. These false perceptions of time show that the uploads have lost a stable grasp on reality and are left with an artificial perception of it. The commodification of time represents an extreme exercise of power over these digital people, resulting in a complete loss of agency over how the Pantheon uploads perceive their environment.
In Upload, by contrast, the characters experience time as largely static, a point emphasized by how little they are allowed to change. For example, one plot line in Upload follows a ten-year-old boy who died and was uploaded. His parents decided never to upgrade the image of his body, keeping his physical appearance that of a ten-year-old. By the time of the show he had been uploaded for eight years, meaning his mental age was eighteen. The boy grew frustrated over his lack of growth, and watching his still-living peers pass him by took a toll on his mental health. This lack of autonomy over one’s own body resembles the character Claudia from the book Interview with the Vampire: an adult woman trapped in the body of a five-year-old. These characters grew distressed, angry, and discouraged about living, because there was no guaranteed end to their suffering and no variety in their lives. They faced no foreseeable physical change while surrounded by people who kept growing older. The uploads were essentially objectified, expected to stay exactly as they are, which let the capitalists of the series rationalize treating these digital consciousnesses as objects to be used for their own gain, a form of digital slavery.
Though I have established that these digital uploads are not human because they lack agency and evolution, I did not argue that they are not conscious. These digital consciousnesses have thoughts and feelings, and they can develop relationships because they have the context and memory to grow them. This is one of the few genuinely human characteristics these digital beings retain, and it is what makes people vulnerable to becoming attached to a simulated version of their loved one.
In the show Upload, relationships do evolve: Nathan Brown, the main character, experiences a blossoming love story with Nora, a woman who is still alive. Yet his relationship is only truly tested when there is a risk of losing it. Nathan’s consciousness is almost erased on several occasions, and after each close call or rebooted memory, Nathan always chooses to love Nora again. Xelena Ilon brought up a great quote in her final presentation that contributes to a definition of AI and consciousness:
“That’s what AI can’t yet offer: the friction that fuels growth, the silence that begs understanding, the feeling of being loved by someone who could walk away, but doesn’t because they choose you day in and day out.” – Cathy Hackl
Nathan not only fought for Nora when he was at risk of being lost; Nora fought for him. Their relationship was not mere validation of Nora’s identity or opinions; the couple grew to understand each other and truly love each other. That is what differentiates Nathan from AI.
So if these digital consciousnesses are not human, but not really generative AI either, what are they?
Well, what does it mean to be a digital consciousness? Are we still the same person once we are digitized? What makes a human consciousness human is its mortality. Mortality is what makes people human: the looming presence of death makes people want to live. In digital spaces, that end is not guaranteed. If one’s consciousness is digitized, it is presumed to persist forever, or at least well beyond the lives of one’s kin. It is not until the digital landscape itself is at risk that mortality is felt again. In conclusion, it is not immateriality, necessarily, that makes a human experience inauthentic; what grants authenticity is a looming sense of death or complete agency over one’s perception of one’s environment.
Citations
Orwell, George. Nineteen Eighty-Four. Penguin Classics, 2021.
Rice, Anne. Interview with the Vampire. Ballantine Books, 1997.
Love this expansion!! So interesting that the commodification and digital slavery aspects seem to characterize almost a genre of show; it seems like there are even more examples of this, but Upload and Pantheon are some of the few I’ve found that take place in America. I also really like your point about mortality and the “idea” of perception. I think it was Benjamin who alluded to the concept that perception only matters if you believe in what you’re experiencing. Simply put (for the sake of this argument), even if you were a digital consciousness, only your conviction in what you perceive matters, even if your experiences aren’t physically real. This could extend to the idea of mortality, where authenticity is granted by “a looming sense of death or complete agency over one’s perception of one’s environment” (My #1 Queen). Great post!
Hi Allie!
Thank you for allowing me to respond with such fervor and passion for the topic! In an external conversation I had with you, I really questioned: if uploading a human to a digital afterlife were our reality, how would I prioritize these lives against others? I am still not entirely sure of my answer to this question. Evolution, growth, and presumed mortality are all distinctions I would draw between humans and digital uploads, yet their human likeness (especially if they were one of my own loved ones) would almost compel me to say they are humans just like me. However, part of me knows they are machines, so if I had to choose between saving the life of a human and saving an upload, I must say the human. I just wonder why I can’t give a more definite answer to whether these uploads are human or not.
Thank you for your post, your comment, and for listening to my non-answer! Great job 😀
Great blog post, and so topical right now, especially with the use of AI to recreate celebrities. Long-dead actors are being brought back to life through expert deepfakes to portray past characters on the big screen. This question about authenticity, materiality, and consciousness is so pertinent here, especially with the additional layer of performance: these recreations exist only to portray another character, and the actor no longer exists to exercise agency over the use of their identity. What are the ethical implications of this? Furthermore, what consequences does this have for media-making and the medium of performance?
(The full name-drop was a jumpscare, BTW.)
Hi Xelena!
The ethical implications of using a deceased actor’s image to portray one of their former characters make for an interesting moral dilemma. I think the point about autonomy you bring up is so important! The deceased actor must have had a family member sign their likeness away. The filmmakers presumably intend these deepfakes as an honor to the actor; however, without the actor’s consent it feels like a violation masked with good intentions.
The show Upload has a very interesting dynamic around consent for digital uploads. For someone to be uploaded, they must be conscious and willingly submit to the upload. This seems good on its face; however, many of the people being uploaded are on the brink of death and therefore under immense pressure to make a very permanent decision. For example, Nathan, the main character of Upload, was about to be taken into emergency surgery after a near-fatal car crash and had to decide within the span of a minute whether to be uploaded or go under the knife for a life-saving surgery that was likely to fail. He ultimately chose to be uploaded; however, it was later revealed that the surgery was more likely to succeed than initially presented, and he was outraged that he had had a chance to live yet was essentially pressured into uploading by external figures (his then girlfriend). Here is the clip! https://youtu.be/I0sXm08hoR0?si=De1hBJvq91IrztEC (TW: gore and mild nudity)
Overall, consent and coercion are a big part of AI and deepfake usage today. These examples in media shed light on how situations of pressure to use technology can affect people.
This was a super engaging read! This post really made me think about the delicate line between consciousness and identity. I enjoyed reading the comparison between Pantheon and Upload! It’s fascinating that these uploads can feel, grow, and form relationships, yet still lack the essential mortality that gives life its urgency. If being human is tied to mortality and limits, do you think a digital mind could ever develop its own sense of desire or purpose independently?
Hi!
Thank you for the kind words! To answer your question: if AI ever becomes complex enough to reach the level of human consciousness (scary :’O) and a recognition of mortality, I believe it would end similarly to the movie Her. If AI or digital minds ever found the beauty of human mortality, they would look for a finite end, which is what makes their relationships with others meaningful. My title reflects the sentiment that dying is good, because it means life is a bounded thing and not just an infinite experience. People have to pick and choose what they spend their time on, and that is what makes their lives matter. So, if a digital mind ever truly understood what human consciousness and life really are, it would look for an end.
I really enjoyed reading this and it made me think a lot. Something I keep wondering is how you imagine living would actually feel from inside a digital body. Do you think a consciousness could ever experience time or relationships as real even if everything around it is simulated? Your point about mortality also made me think. If a digital mind could choose its own ending, would that bring back a sense of meaning or is authenticity already impossible once the physical body is gone?
I really appreciate how you complicate the usual “uploaded consciousness = AI” argument by grounding it in agency, growth, and the presence (or absence) of mortality. Your use of Upload alongside Pantheon makes the stakes so clear: without the ability to change, risk loss, or confront finitude, consciousness becomes a kind of suspended animation rather than a life.
Your point actually made me think of Ernest Becker, who argues in The Denial of Death that mortality is the central condition that gives human experience its urgency, meaning, and depth. For Becker, it’s precisely our awareness of death, that “looming sense,” as you put it, that pushes us toward love, creativity, and ethical choice. Reading your post through his lens really sharpened something: it’s not just that digital beings lack bodies; it’s that they lack the existential grounding that mortality provides. Without that horizon, their relationships and their suffering take on a strangely weightless quality.
I also loved how you framed the uploads as “not human, but not AI,” occupying this uncanny middle category where they can feel deeply but are denied the conditions that make feeling transformative. Your reading of Nathan and Nora’s relationship really highlights that distinction: love means something because it can be lost, because someone chooses it again and again in the face of vulnerability.
Woah, this was a super cool read! I’m not familiar with either of the shows but will be watching them over winter break. A lot of media has touched on digital memory and consciousness, but you offer an interesting distinction: digital consciousness becomes inauthentic not simply because it lacks a physical body, but because it lacks mortality and agency. This leads to a sharper question that grows out of your argument:
If mortality and agency are what make experience authentic, could a digital consciousness ever reclaim authenticity by choosing its own limits—its own risks, endings, or boundaries?