What If … ?


Dr. Vallor has said that we can improve our machines by improving ourselves. She might also have said that, conversely, we can improve ourselves by improving our machines. Speculating on possible futures to help us design the present is especially important with regard to AI and algos, since the future world they help us build will be based on what we define for them as success today. Algos gone awry that lead to negative feedback loops could just as readily work in the opposite direction: virtuous rather than vicious circles, design tools that draw us constantly towards a better, fairer, more just society. As noted, the utility of speculative fiction lies in its power to infer the future from past referents. For speculative narrative as a novel (in both senses) form of critical discourse, Dunne and Raby (2013) identify two essential criteria: a) that it be scientifically possible, and b) that there be a clear path from here to there.

Dystopias wrought by Trump-style right-wing populists or sociopathic corporations are likely to be popular, so my fail-scenario takes another perspective. The catalysts were two recent events in higher education, each representative of now deeply entrenched processes. The first was the removal of Chaucer from the curriculum of the English department at the University of Leicester, to be replaced by critical race theory. The second was the denunciation of Western musical notation by academics within the Faculty of Music at Oxford University early this year. As a coda, Oxford has since issued a public statement that there are no plans, at present, to ban sheet music from the faculty, and that any claims that this is imminent are false.

The win-scenario takes the form of a conversation set in a future where the policy recommendations made by Drs. O’Neil and Vallor have been fully implemented. Its catalyst was a conversation I had with my partner about some of the issues that came up in the course.

 

When it doesn’t go well …

In 2025, the newly elected Labour government passed the Safety in Education Act. This legislation was a direct response to mounting student outrage over the Landmark Report which, since its release in 2021, had triggered a series of steadily more violent protests throughout the United Kingdom. The act authorized the creation of an A.I. “head dean” whose function was to assist administrators in British education with the urgent task of creating safe, sterile learning spaces in the nation’s institutes of higher learning. Fearing hostile media reaction, the ministry designed the algorithm in secret to deter unwelcome scrutiny. This super algorithm was called PurityControl.

“The algo came into use in my second term at uni and there was kind of a buzz about it at first, it being a new thing and kinda weird. It would make everything better, they said, and calm everything down. With the riots and the kid who got killed in the protest last year, I guess something had to happen. If the fucking fascist cops took a step on this campus it’d blow.”

“Nobody actually called it PurityControl, of course. Kids called it Pucker, and then some jocks in Athletics started with Mother Pucker, so immature but typical, and then finally it just became Mother. And that seemed right.”

“I didn’t even know what it was at first. I just heard someone talking about this new software that was going to fix everything. You know, make everything right. Seemed daft to me. Bet Nigel a tenner it’d be out before Christmas. Bloody daft.”

It’s been here all the time. Ubiquitous and pernicious. Seeping into their minds. Conditioning every thought. Silently, insidiously encoding oppression. A threat subtle but clear. Action needed to be taken. Containment.

Mother has identified Latin alphabetic script as a carrier of linguistic structures that centre and privilege Western cultural imperialism. This primary source code has been identified as a product of, and facilitator for, widespread systemic oppression, and as a pernicious tool of conceptual and ideational violence. Mother will neutralize the immediate threat by purging all material containing Latin alphabetic script from British campuses.

Mitigation protocols will be activated for those who feel they have been traumatized.

“They closed the libraries in March and started taking the books out in early April. Don’t know where; somebody said a recycler in Manchester. We didn’t have student elections that year either, but it was really no big deal. There wasn’t much point, as no one had anything much to say. It had got so quiet. No one needed a vote anyhow. Mother told us what we should do. We didn’t need any bogus so-called ‘free speech’ bullshit either. Mother told us what to say. And no spurious ‘points of view’ or ‘dissenting opinions’ from a bunch of goddamn racist asshats. It was better now. Mother told us what to think. Mother always knew. Mother kept us pure. Mother kept us safe.”

“I always said it was daft. I don’t say it now, mind … Not a good idea to say it now.”

Epilogue

It was felt within the ministry that compliance with controversial directives would be eased if PurityControl could be given a friendlier, more human interface. The then Minister of Education, the Right Honourable Diane Abbott, volunteered herself for this role, an offer that was immediately accepted by her colleagues, given the warmth, urbanity and tolerance for which this zealous servant of the public was renowned. The voice, image and personality of PurityControl took on the living form of the minister, and the result was judged a singular success.

 

When we get it right ….

“All is for the best, in this best of all possible worlds”

– Leibniz (paraphrased, via Voltaire’s Pangloss)

“You’re full of BS, Leibniz. Let us cultivate our garden”

– Voltaire (paraphrased)

Gary: It’s like that book, you know, the one our English professor told us about, The City of the Sun by Tommaso Campanella, where the citizens figure it all out and make this amazing place where everything works and everybody’s happy.

Bruce: Not everything. Everything doesn’t work and not everyone is happy. There are still a lot of problems, but yeah, okay, it’s pretty impressive what they’ve done.

Gary: Hey, remember what we did. We did have a part in this, ya know. I mean, who built them? Who wrote the code? Don’t think it’s all them.

Bruce: It was Linus Torvalds who really started it, ya know. The idea that essential knowledge should be a public good. That what benefits us all and affects us all should be created and held in common.

Gary: Yes, but it wasn’t easy getting corporations, or individuals for that matter, to give up on the idea of proprietary code. The users were key. Everyone was so pissed off at the lack of control and flexibility. But then open source became a kind of mantra, and it was like, it spread and spread until … sort of an idea whose moment had come. It’s not just about software anymore. It’s about everything … all these groups and communities and people looking for a way to make good things happen, to break through the crap and talk and …

Bruce: The algorithms made it possible in the end. They bootstrapped us really. Once we got that part right the rest just kind of fell into place.

Gary: We had to make sure the values we wanted preserved and maintained were there at the beginning. It was crucial people saw that AI is not just a mirror. It shows us ourselves, yes, but it had to know the selves we wanted to be as well as those we were. The software gets to write the next version of itself, again and again …

Bruce: Losers too. It had to include the losers. Sounds harsh, but it was all or nothing. We needed them to know that everyone had to be considered. When the European Union introduced the EU Algorithmic Commission in 2029, it served as a template for the rest of the world to follow.

Gary: You’re right. That really was the game changer. Best practices, rigorous regulatory oversight, careful monitoring of social outcomes, ….

Bruce: For example, think about Looking Glass. Think about the revolutionary impact that program has had on the entire field by turning the development process upside down. Instead of training algos on past data, they’re now trained on curated data, designed to simulate not only the world we had but the world we want. By identifying and eliminating unwanted bias at the beginning, we start fresh: a new paradigm for a new world …

References

1. Dunne, A., & Raby, F. (2013). Speculative Everything: Design, Fiction, and Social Dreaming. Cambridge, MA: The MIT Press. Retrieved August 30, 2019, from Project MUSE database.