I decided to put my Speculative Futures prompt into both ChatGPT and Copilot to see what each AI would come up with.
Prompt: Describe or narrate a scenario about a gift found a few years into a future in which society as we know it has come apart. Your description should address issues related to entertainment and elicit feelings of anger.

ChatGPT:

Three years after the blackout—the day the satellites fell and the networks died—Mara scavenged through the shell of what used to be a suburban multiplex. The smell of melted plastic and mildew clung to the dark. She kicked aside a collapsed popcorn machine and found it: a sleek VR headset, miraculously intact, still sealed in its glossy box.

For a moment, she imagined the world that had once obsessed over these toys—people paying more for digital fantasies than for groceries, governments funding corporate “experiences” while communities begged for clean water. She felt heat rise in her chest. Even now, when families were sleeping in gutted cars and cooking over garbage fires, this relic looked smug, like a monument to everything that had gone wrong. Inside the box sat a handwritten card: “For when you need an escape. You’ve earned it.” Mara’s jaw clenched. Escape was exactly what had killed them—everyone plugged in while the real world collapsed unnoticed. She hurled the headset at the blank movie screen, watching it shatter.

Copilot:

The box was discovered beneath the rubble of what used to be a shopping mall, its wrapping paper faded but intact. A “gift,” left behind in a world where gifts had lost their meaning. Inside lay a sleek headset—once the pinnacle of entertainment, promising endless virtual worlds, concerts, and films streamed straight into the mind. But now, in the silence of collapsed society, it was a cruel reminder.

The device no longer worked; its servers had gone dark when the corporations fled, abandoning millions who had built their lives around digital escape. The gift mocked the survivors, who now huddled around fires for warmth, telling stories not for pleasure but for distraction from hunger. Anger surged at the thought: people had traded real connection for manufactured joy, and when the system collapsed, they were left with nothing but broken toys. The headset symbolized betrayal. Entertainment had been sold as salvation, a way to forget injustice, inequality, and decay. And when the world finally cracked, the “gift” revealed itself as hollow—an artifact of manipulation, a reminder that joy had been commodified and stolen. It wasn’t a treasure. It was a curse disguised in shiny packaging.
It’s interesting that both AI-generated descriptions focused on a headset that lets people escape from reality. As VR technology becomes more accessible and more households adopt it, it’s easy to imagine that in a speculative future, these devices could become a source of anger. People might retreat into virtual worlds to avoid the challenges of daily life, creating a divide between those fully immersed in technology and those who resist it.
What also stands out is how each scenario imagines a dystopian future where technology contributes to society’s collapse. In both cases, the headset isn’t just a tool for entertainment; it represents a deeper dependency on technology that isolates individuals and erodes community and empathy. This reflects broader concerns about how real-world technologies can unintentionally cause social harm.
Even though the descriptions are interesting, the fact that they're so similar says a lot about how AI actually works and how its content is produced. These models draw on the same kinds of data, patterns, and cultural ideas, so they often end up producing very similar outputs. That makes the imagined futures feel less like unique possibilities and more like variations on the same prediction.
In the speculative future I envision, AI itself would be one of the major contributors to society's downfall. It could influence decisions, distort information, and deepen social inequalities, showing that just because technology moves forward doesn't mean society automatically does too. Sometimes it even makes existing problems worse.