Task 10: Attention Economy

Here’s a non-exhaustive list of the dark patterns that I noticed in “User Inyerface”:

  1. “Click here to go to the next page” – the imagery, size, and placement of the buttons make it difficult to know where to actually click
  2. Absurd password requirements
  3. Double negatives: “I do not accept the terms and conditions” (see the sketch after this list)
  4. A help bar that repeatedly popped up
  5. A “Hurry up, time is ticking” timer that stressed me out
  6. “Select 3 interests,” where the ‘deselect all’ button is difficult to find because of the way we visually scan text

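To make item 3 concrete, here is a minimal sketch of the double-negative pattern, written in TypeScript with hypothetical function names (my own illustration, not the game’s actual code). Because the label is negated, leaving the box unchecked is what actually grants consent:

```typescript
// Honest pattern: a checked box means consent was given.
function honestConsent(checked: boolean): boolean {
  return checked;
}

// Dark pattern: the label reads "I do not accept the terms and
// conditions", so the negation flips the meaning and consent
// becomes the inverse of the checkbox state.
function doubleNegativeConsent(checked: boolean): boolean {
  return !checked; // an unchecked box counts as "I accept"
}

// A hurried user skips the checkbox, assuming unchecked means "no":
console.log(honestConsent(false));         // false – no consent recorded
console.log(doubleNegativeConsent(false)); // true  – consent recorded anyway
```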
It went on like this for a while, and I nearly ran out of patience to complete the game. It’s clear that the motivation behind these deceptive practices is to wear you down and erode the attention you are paying to what you are doing. They are also intentionally designed to nudge us into actions whose consequences we may not be aware of. Since completing the game I have become hyper-aware of these ‘dark patterns’ (Brignull, 2011): once during an attempt to cancel my DoorDash account, and again when ordering from the self-service screens at a McDonald’s in France. The latter experience made me realize how difficult these tactics are for people with low print or digital literacy, or who speak another language, because we rely so heavily on the way information is visually organized on the screen. We are naturally inclined to click on the text highlighted in a green (or sometimes red) button. People with low levels of digital literacy are particularly vulnerable to these tactics, as they may not expect to be deceived, may not understand the implications of their actions, or may not be aware of the potential harms of sharing certain information or granting certain access permissions.

In her TED Talk, Zeynep Tufekci made a compelling argument about the dangers of persuasive technology when she noted that it seems relatively innocuous at first (being inundated with ads for the same product you were just searching for) until we understand the ways that algorithms are shaping and controlling our perceptions of the world around us (Tufekci, 2017). The internet is a difficult place to regulate, and it is even harder to decide at which point these persuasive architectures stop being simply persuasive and cross over into manipulative territory. As both Harris (2017) and Tufekci (2017) noted, the intent behind these structures was not to be unethical or cause harm, but they have had a variety of unintended consequences. Clearly, we need further regulation of the internet, but I think that change will be a much slower and more difficult road than the development of these technologies has been, in part because many people do not yet understand the ethical implications of these attention technologies.

References:

Brignull, H. (2011). Dark patterns: Deception vs. honesty in UI design. A List Apart, 338.

Harris, T. (2017). How a handful of tech companies control billions of minds every day [Video]. TED.

Tufekci, Z. (2017). We’re building a dystopia just to make people click on ads [Video]. TED.
