Week 10: Attention Economy

Posted in Tasks

This game was maddeningly fascinating and entertaining to play. Once I accepted that I was Alice in a digital Wonderland, I found myself both amused and irritated by the gimmicks designed to confuse or mislead me. The most alarming part, though, was the artificial urgency created by what Zeynep Tufekci (2017) would call the “persuasion architecture” of the game’s design. I was frantically trying to enter correct information about my identity while dodging intrusive pop-ups and a timer that set the pace and tone of the experience. In my rush to complete the task, I truly was the user rather than the person. That pressure prompted me not to worry about the accuracy of my information unless an error stopped me from proceeding. It was strangely refreshing to have to think about the components of my email address, for example. The experience qualifies as satire, or perhaps farce, in the way it renders absurd something we now take for granted. The satirical joke is on us for playing the game on the designer’s terms in the first place.

If this is the inversion of a typical online experience, the real thing is distressing in its convenience and persuasive power. When I consider, for instance, what it would mean to design a site that deliberately tests a user’s preferences or limitations and captures that data, it isn’t hard to see the potential that data would hold for marketing purposes. It would be interesting to measure how long someone sticks with a particular task (I was a stubborn, slow learner on the last one) and then optimize it, as Snapchat has done to create a kind of loyalty beyond reason with its Snapstreaks (Harris, 2017). User Inyerface is arguably a step toward the humane, human-centered design that Harris (2017) calls for, ironically so, because its deliberately confusing design makes us aware that a group of people constructed it intentionally.

While it is not hard to imagine, as Harris does, what human-centered designs could look like and do, it is alarming to acknowledge how difficult it is to regulate, or even hold accountable, the corporate entities that, as Tufekci (2017) and McNamee (2019) point out, are motivated simply by the profitability of their business models. It is this absence of ethical consideration that invites Facebook’s engineers to constantly refine the way its algorithms learn to “nudge” users. It was strange to realize, by the end of the game, how much I had learned to adapt to the manipulations and regulate what I was doing. I felt like I had figured it out. In the end, however, I was the one who refused to give up until I had completed the task, just so I could be told I was awesome. Thanks, Carlton!


References

Harris, T. (2017). How a handful of tech companies control billions of minds every day [Video]. TED. Retrieved from https://www.ted.com/talks/tristan_harris_the_manipulative_tricks_tech_companies_use_to_capture_your_attention?language=en

McNamee, R. (2019, January 28). I mentored Mark Zuckerberg. I loved Facebook. But I can’t stay silent about what’s happening. Time. Retrieved from https://time.com/5505441/mark-zuckerberg-mentor-facebook-downfall/

Tufekci, Z. (2017). We’re building a dystopia just to make people click on ads [Video]. TED. Retrieved from https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click_on_ads?language=en