Around 2019 I was having a very hard time with how I used my computer and phone. I kept spending time on sites that I did not want to spend time on, felt feelings that I did not want to feel, and found myself giving advice to or arguing with people I only knew tangentially. As such, I decided to severely limit my screen time, which became much harder than expected: I found it very difficult to find a happy medium between blocking a particular app (like Facebook or Instagram) outright and simply typing in my own password to extend my screen time limit. Around this time I asked my wife to set a screen time password on my phone and limit my social media time (Facebook, Instagram, Reddit) to about 40 minutes a day cumulatively. Additionally, I unfollowed all of my friends on Facebook, so my news feed is only ads. Every once in a while I'll just click "uninterested" on every ad I see until my news feed runs out. It's true! There is a bottom to the news feed.

Even with this limited access to certain sites, an increased knowledge of my own psychology and triggers, and a personal desire to avoid these sites, the best outcome I've managed is a draw. I don't feel like I'm in control of either; it's just a forced stalemate. The User Inyerface game was very frustrating and difficult to navigate and reminded me of many online situations I've been in before. I have a decent ad blocker on my computer, and whenever I use a different computer I'm always shocked that "this is the internet".
The altruism of Tristan Harris is both encouraging and demoralizing. As we've seen throughout this course, there is a strong correlation between economics and text technologies. The economics of the internet is advertising, so the semiotic domains of using and creating the internet are those of advertising psychology. This is quite demoralizing because large systems tend to take on lives of their own.

I was recently listening to a podcast called Homebrewed Christianity in which the host and guest discuss the French philosopher and sociologist Jacques Ellul's ideas about technology and society. Without getting into too much detail, they discuss the power dynamics of technological innovations and compare them, via religious metaphor, to "idolatry". I found this comparison quite poignant. Idols are things we offer sacrifices to in the hope that they will give us some kind of power in return (knowledge, predictability, wealth, etc.). Fuller states that one of the major themes of humanity is the predicament of thinking that "possessing power is ultimately our solution [which] generates all kinds of death" (37:40). There is a vicious cycle of individual, collective, economic, and environmental "sacrifices" that we make for the power that technology supposedly affords us. In addition to the massive power imbalances between those who generate technological systems and those who use them, there is also the negative feedback loop of looking to technology to fix the problems of technology.

Ellul has a rather complicated concept he calls "technique", which refers to "the totality of methods rationally arrived at and having absolute efficiency" within a technology, procedure, machine… I have come to understand it as the absolute efficiency of method within a semiotic domain (Ellul, 1964). Drawing on this concept, Fuller and Morelli discuss the "techniques" of technology and economics, and how we often look to a particular technique to solve the problems of that same technique. We look to technology to solve the problem of technology. We look to economics to solve the problem of economics. We look to politics to solve the problem of politics. All of these techniques promise power, and we make sacrifices to ensure that our team, nation, ideology, or system controls them.
I have listened to more than one podcast on the future of AI, and many of the people teaching at the interdisciplinary intersections of AI suggest looking outside a particular technique to fix its problems: looking to spirituality, to relational and community-driven religions or organizations, to getting back to caring for your neighbour and the land… I suppose those are the ways that I am trying to negotiate the black hole of ad-driven technologies.
Ellul, J. (1964). The technological society (J. Wilkinson, Trans.). Vintage Books.
Fuller, T. (Host). (2024, July 14). Jacques Ellul & the technological society with Dr. Michael Morelli [Audio podcast episode]. Homebrewed Christianity. https://trippfuller.com/2024/07/14/michael-morelli-jacques-ellul-the-technological-society/
Joti Singh
July 26, 2024 — 10:13 am
Hey Jonathan,
Thanks for sharing your experience. It sounds like you've been on quite a journey with managing your screen time and digital habits. I can totally relate to the struggle of trying to find that balance and control over how we use our devices; I find that as time passes my attachment to my devices lessens as well.
Your decision to limit screen time and unfollow everyone on Facebook is impressive! I think it’s a powerful step to reclaim your attention and reduce the influence of these platforms on your daily life. I get what you mean about feeling like it’s just a forced stalemate, though. Even with all the strategies and tools at our disposal, it can still feel like we’re just holding the line rather than truly winning the battle.
The User Inyerface game really brings this home. It’s a reminder of how intentionally frustrating and manipulative some online experiences can be. It’s designed to make us aware of the tactics used to keep us engaged and clicking, much like what you described when you notice the difference on a computer without ad blockers.
Reading your reflection, it's easy to see how we can get caught in a cycle of looking to technology to solve the problems created by technology, which often leads to more sacrifices and imbalances. Your point about looking beyond the typical solutions to address these issues is very thought-provoking. Embracing spirituality, community, and relational approaches can offer a refreshing perspective and might be what we need to break free from the black hole of ad-driven technologies. It's a reminder that sometimes the answers lie outside the systems that created the problems in the first place!
jonathan tromsness
July 29, 2024 — 3:37 pm
Thanks for the reply, Joti,
It's interesting how this course has helped me reflect more deeply on my technology use. It has given me words and categories for feelings I've experienced. Having the language and research to back up and justify somatic and emotional experiences is half the battle! I'm hoping I can both help and learn from my students in this realm.