Microsoft Copilot Generated Scenario
Prompt used: Describe a corporation found a century into a future in which society as we know it has come apart. Address issues related to class and elicit feelings of anger.

Scenario: Horizon Resource Dynamics, 2125
In the year 2125, long after traditional governments fractured under climate migration and economic collapse, Horizon Resource Dynamics presents itself as humanity’s “stabilizing force.” In reality, the corporation functions as the world’s de facto ruling class, operating through a network of privatized security forces, patented food systems, and vertically integrated housing enclaves. Horizon’s corporate slogan—“We provide what remains”—is displayed in every settlement that still has electricity.
The corporation’s origins were modest: a logistics company specializing in water purification. But when state infrastructures failed, Horizon purchased entire regions outright, including reservoirs, fertile land, and the digital backbone that once supported public communication networks. With these acquisitions came power—quiet at first, then absolute.
Horizon operates through a three-tiered class system that it insists is “merit-based,” though no one can name a single case of mobility. At the top are Asset Holders, a fraction of the population who own shares in Horizon’s terraforming and bio-engineering ventures. They live in sealed climate-controlled towers called Vertical Havens, complete with imported air, synthetic gardens, and drone-delivered meals. Though Horizon claims these towers are “model communities,” no one outside the Holder class has seen their interiors. Unauthorized viewing is considered a breach of bio-security.
Beneath them are Contributors, the trained specialists who maintain Horizon’s infrastructure. They are granted limited healthcare, sterilized water rations, and housing in modular blocks known as Function Settlements. Contributors sign perpetual labor contracts, though Horizon advertises these as “lifetime opportunities for stability.”
At the bottom are the General Populace, the billions displaced by the collapse of ecological and political systems. They live outside Horizon’s protective shields in sprawling unregulated zones referred to euphemistically as Open Terrain. Horizon’s public service announcements repeat a single message to these communities: “Compliance ensures access.” But access to what is never specified.
The corporation’s newest invention, the EquiSync Band, is marketed as a wearable “equity assistant” that monitors individual behavior and recommends pathways to “personal advancement.” In reality, the bands track movement, measure compliance, and relay biometric data to Horizon’s classification algorithm, CLARITY, which determines ration eligibility. Citizens without bands are marked as “voluntary non-participants,” rendering them ineligible for aid.
Perhaps the most infuriating aspect of Horizon’s regime is its tone. Every quarterly broadcast begins with cheerful music and a smiling spokesperson saying, “Together, we rise.” Meanwhile, children in Open Terrain walk miles to access alkaline water. Aging Contributors are quietly relocated to “Efficiency Centers,” from which no one returns. The Asset Holders continue to drift between their towers in personal sky-transit shuttles, untouched by the consequences of the world they profit from.
Horizon insists it is “the guardian of humanity’s future.” But the future it guards is not for everyone—only for those who can afford to belong.
Reflection
After I ran the prompt through AI to create a scenario about a future corporation in a collapsed society, my first thought was how predictable the shape of the future was. Copilot produced something that looks sophisticated on the surface (clean phrasing, polished descriptions, confident world-building), but underneath it feels like a regurgitation of familiar dystopian tropes assembled without much creativity at all. Instead of imagining something new, the AI is obviously reassembling ideas that have already been culturally digested and fed back into the machine as training material. It’s less a story about 2125 than it is a mirror of the stories our present-day media already finds plausible, comfortable, or at least narratively acceptable.
One of the clearest patterns in the output is its default assumption that corporate totalitarianism will replace the state. Copilot immediately jumps to the idea that if society collapses, a corporation will take charge through branding, surveillance, and soft authoritarianism. I didn’t tell it any of that; it just went there. That reveals how deeply entangled “the future” is with corporate power in the cultural data the model was trained on. Inayatullah’s idea of “used futures” comes to mind here: the model isn’t imagining alternatives; it’s repeating a future we already know too well (Inayatullah, 2008).
The tone was another interesting tension. It slips into a sort of mock-corporate PR voice (“We provide what remains”), but it stays safely outside anger, even though the prompt explicitly asked for emotion. It sounds cold, almost clinically detached. Instead of narrating anger, it places the responsibility on me, the reader, to supply the anger myself. This avoidance feels like an algorithmic safety instinct, an unwillingness to cross into anything that might look like incitement. This avoidance of responsibility actually reminded me of the real-world story of the Uber self-driving car fatality in Tempe, Arizona in 2018. In that case, the car’s AI hesitated when classifying a pedestrian and ultimately passed responsibility back to the human driver at the last possible moment (Greenspan, 2021). It knew something was wrong, but it didn’t escalate loudly. The system couldn’t slam on the brakes, and didn’t want to disrupt the ride, so it quietly slipped out of autopilot, with devastating consequences. In a softer rhetorical way, Copilot does something similar: it recognizes harm and injustice, but it refuses the emotional escalation. It hands the affective labour back to me.
Class is technically addressed in the narrative, but in a very administrative way. The model uses capitalized labels (Asset Holders, Contributors, General Populace) as if class were simply a filing system instead of a lived human experience. It treats inequality like something you might diagram in a corporate PowerPoint. Again, that says something about the worldview embedded in current AI: that class is an org chart, not a social or emotional reality.
The invention of the “EquiSync Band” was almost a bit too on-the-nose. A device that pretends to promote equity while actually performing surveillance feels uncomfortably close to how tech companies already deploy benevolent-sounding language to justify data extraction. It’s exactly the kind of techno-solutionist gesture speculative designers warn about: something that looks ethical on the surface but embeds deeper forms of control (Auger et al., 2021).
Stepping back, the whole scenario doesn’t feel like a leap into the future at all. It’s basically our current fears (surveillance, inequality, corporate overreach) just turned up a notch. And what’s missing stood out just as much. There’s no sense of people pushing back, no community support, no alternative ways of living, no voices outside a very Western, very corporate frame. Copilot isn’t imagining a new world so much as replaying the one we already know, only slightly darker. That’s where my job comes in: not just to notice those limits, but to look past them and imagine something truly different.

Why I Created This Propaganda Poster
I decided to make the Horizon propaganda poster just to see what the “official messaging” of this future world might look like. Copilot’s story talked so much about branding and slogans that it felt natural to imagine what their posters would actually say. Creating it helped me picture the world more clearly, while highlighting how creepy it is when cheerful messaging gets used to cover up something more sinister.
References
Auger, J., Hanna, J., Mitrović, I., Encinas, E., Božanić, S., Šuran, O., & Helgason, I. (2021). Beyond speculative design: Past – present – future. SpeculativeEdu / Arts Academy, University of Split.
Greenspan, S. (2021, September 28). Cycle 1: Databody [Audio narrative]. Bellwether. https://thisisbellwether.bandcamp.com/album/cycle-1-databody
Inayatullah, S. (2008). Six pillars: Futures thinking for transforming. Foresight, 10(1), 4–21. https://doi.org/10.1108/14636680810855991


I almost never write by hand anymore, though I used to when I was younger. I was born in 1978, so computers were not part of my elementary or really even high school education. My handwriting has always been terrible, but I have always felt that when writing by hand, there exists a strong connection between my thoughts and the way they spill onto the page. Unfortunately, my brain works faster than my hand can keep up, which only makes my penmanship look even worse. Over time, as I have become a better typist, I have learned to develop a similar connection between my mind and the keyboard. However,