Task 12: Speculative Futures

In Task 12, we were presented with the challenge of creating two speculative narratives about our potential relationship with media, education, text, and technology in the next 30 years.

I have decided to present my speculative narratives in the form of podcasts. I have been listening to many audiobooks lately and have found them to be a good way to digest information on the go. Audio also helps me better visualize the context and situation of a narrative.


1. “Class Act” – An artificial intelligence (“AI”) solution for post-secondary students.

In my first speculative narrative, I explore “Class Act”, an artificial intelligence earbud that taps into users’ brain waves as well as the auditory, visual, and tactile stimuli they experience in order to suggest and implement real-time learning interventions. Because Class Act is connected to the user’s brain waves, these interventions can take the form of visual or auditory resources or immersive simulations that account for the user’s context and background.


2. “Credit Note” – An AI tool and algorithm that assesses class participation marks and tailors assessments based on student behaviour.

In my second speculative narrative, I explore “Credit Note”, an AI tool and algorithm that collects student data from electronic devices and uses that data to adjust participation scores and tailor assessments. This speculative narrative considers a situation where AI and algorithms go wrong.

1 thought on “Task 12: Speculative Futures”

  1. SheenaChan

    Hi Richard,

    I found your podcast “Credit Note” interesting. It’s similar to one that I forecasted in T12, but you looked at what would happen if it was taken in another direction, which is the danger behind any technology. Being able to forecast technology gone wrong is necessary, since we’re ultimately responsible for our designs and creations. Some of the issues that Credit Note took care of, such as the sharing of answers and the badmouthing of classmates, are issues I deal with, which is kind of tiresome, so I can understand the temptation to let a machine deal with it, and wouldn’t it remove bullying from schools? As tempting as it is, though, I think teachers can do what machines can’t, which is model the expected behaviour. For example, instead of asking for the answer, student A can ask student B to watch A work out the question so B can point out mistakes in A’s calculations, which A could then fix. Instead of badmouthing students, A could tell B how they feel and B can respond. And what happens if A badmouths B in therapy? Would Credit Note report and punish the student?

    Dr. Vallor (2018) notes that AI technology has not reached the stage where it can function alone. I feel technology should be used to transform our classrooms, not replace teachers, because just as teacher-student relationships are a crucial part of students’ learning journeys, technology is best used when there is human-technology interaction.

    References

    Vallor, S. (2018, November 6). Lessons from the AI mirror Shannon Vallor [Video]. YouTube. https://www.youtube.com/watch?v=40UbpSoYN4k&t=872s
