Invasive technology is overtaking our homes and lives. Even though we may welcome technology such as phones, computers, and smart watches into our spaces and daily routines, that does not change the fact that they can present major risks. Right now, a new form of digital literacy is increasingly challenging to tackle: consent and the privacy of personal data. This is partly because surveillance technology is designed to be deceptive, and the issues are complex enough to challenge even the best researchers. The New London Group notes the need to educate individuals about these changing digital landscapes, particularly in the following realms of our existence: “our working lives, public lives (citizenship), and our private lives (lifeworld)” (New London Group, 1996, p. 65). As the boundaries between these landscapes blur through integrations of advanced technology, our digital citizenship overlays our private world. So, while one might have a generally high level of digital literacy, one may still not understand the relationship between technology and privacy. Modern digital literacy requires far more than knowing how to physically maneuver devices and their applications; it requires some understanding of broader sociopolitical conditions.
The New London Group identifies “the term ‘multiliteracies’ as a way to focus on the realities of increasing local diversity and global connectedness” (1996, p. 64). Large tech companies have found their way into the corners of the homes of millions of users across the globe. Speech recognition technology doubles as a method of audio scene analysis: companies use our at-home speech recognition devices (e.g., Alexa and Google Home) to collect auditory data and scrape for sounds beyond our device-directed speech (Turow, 2023). This collection of audio data can reveal intimate details about our identities and the dynamics of our closest relationships. Joseph Turow writes in The Voice Catchers, “few of us realize that we are turning over biometric data to companies when we give voice commands to our smartphones and smart speakers, or when we call contact center representatives, but that’s exactly what is happening” (2023, p. 227). Yet data collection is an almost inescapable byproduct of device usage, deeply embedded in our social practices. Clearly, users need to learn how their personal data is collected and how to protect their personal information.
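To make “audio scene analysis” concrete, the short sketch below shows how readily off-the-shelf tools can label the ambient sounds in a recording. This is a minimal illustration, not a reconstruction of any company’s pipeline: it assumes Google’s publicly released YAMNet sound-event classifier (available through TensorFlow Hub), and the file name bag_audio.wav is a hypothetical stand-in for a clip like the one in my video.

```python
# A minimal sketch of automated audio scene analysis, assuming the publicly
# available YAMNet model (a sound-event classifier trained on AudioSet).
# "bag_audio.wav" is a hypothetical example recording.
import csv

import numpy as np
import soundfile as sf
import tensorflow_hub as hub

# Load the pretrained classifier and its sound-event class names.
model = hub.load("https://tfhub.dev/google/yamnet/1")
with open(model.class_map_path().numpy().decode("utf-8")) as f:
    class_names = [row["display_name"] for row in csv.DictReader(f)]

# YAMNet expects mono, 16 kHz, float32 audio.
waveform, sample_rate = sf.read("bag_audio.wav", dtype="float32")
if waveform.ndim > 1:  # downmix stereo to mono
    waveform = waveform.mean(axis=1)
assert sample_rate == 16000, "resample the clip to 16 kHz first"

# Score each short frame of audio, then average over the whole clip.
scores, embeddings, spectrogram = model(waveform)
mean_scores = scores.numpy().mean(axis=0)

# Print the five most likely sound events in the clip. Labels such as
# "Zipper (clothing)" or "Liquid" are exactly the kinds of inferences a
# human listener makes in the exercise that follows.
for i in np.argsort(mean_scores)[::-1][:5]:
    print(f"{class_names[i]}: {mean_scores[i]:.3f}")
```

Even this toy example produces labeled guesses about what is happening in a room from sound alone; commercial systems, with far more data and context about the speaker, can plausibly go much further.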
In the following video, I perform an experiment with my viewers. The purpose of this exercise is to reimagine what’s in my bag and to demonstrate that a soundscape without any speech is still very revealing of a person’s space (or bag, as in this example).
I ask the listeners to listen to an audio clip of me rifling through my bag and perform an audio scene analysis. I then share the video alongside the audio and ask them to assess whether their guesses about the objects were correct.
Can my viewers identify the contents of my bag from the sounds of the objects alone?
Some sounds are very telling of the object, while others are more ambiguous and harder to pinpoint. The sound of a pill bottle is very distinct. You can hear me shake the bottle, followed by clicking sounds as I open the child-safety cap, and then a slurping sound. At this point, the listener can likely discern that I am drinking, and would probably assume that I am taking a pill and washing it down with some kind of drink. This sound could signify that I am not feeling well, or that I take some kind of medication. The pill bottle came out of my handbag, which I carry with me everywhere, so one could assume that I take those pills often enough to warrant having them on hand. Through this analysis, a listener could make assumptions about my health and well-being. We assume our health is an intimate topic shared only with those we trust, yet an audio clip of my bag reveals that this information is far more accessible than we might think.
Hopefully, this exercise demonstrates that knowing the risks of technology is an integral part of digital literacy. Our technology education needs to include an awareness of how our devices collect data about us and who has access to that data.
References
The New London Group. (1996). A pedagogy of multiliteracies: Designing social futures. Harvard Educational Review, 66(1), 60–93. https://doi.org/10.17763/haer.66.1.17370n67v22j160u
Turow, J. (2023). The voice catchers: How marketers listen in to exploit your feelings, your privacy, and your wallet. Yale University Press.