Have you ever read children’s books that invite interaction? These are joyful books to read with learners: the author invites you to press and mix the images on the page and predict the outcome, which is revealed on the following page. Much like when we navigate online, we make predictions and choices, doing what we believe is being asked of us and assuming it will lead to our desired result. Eventually we stop reading the cues, trusting that our assumptions will work out okay.
However, ‘dark patterns’, or nefarious designs, can manipulate users. The game User Inyerface reveals how naive or passive we can be when engaging with user interfaces. Below is a summary of my experience.
User Inyerface proved frustrating. After much persistence, I threw in the towel when, in what may or may not have been the last part of the game, I finally decided I must NOT be human.
I realized how much I take for granted when engaging with a user interface: tapping highlighted buttons without reading them, clicking on text that looks like a link, assuming a blank field doesn’t need clearing before typing, and simply knowing that months will be listed chronologically.
We are trained to engage with technology and ‘read’ the text of screens in a specific way. We’re coded to click, skim, and predict without critical thinking. The final struggle was the ‘checking you are human’ page. In full transparency, I still have no idea whether ‘bow’ meant a tie or an encore, or whether ‘glasses’ referred to the ones you drink from or the ones you wear on your face… and don’t get me started on the circles! Moreover, did the photos correspond to the boxes above them or below?
We are trained not only by clicks but by the directionality of how we read a page and progress through screens. The age slider is a perfect example: it was disorienting because it sat at the bottom left rather than the right of the screen, and it was almost invisible until I went searching for it.
My takeaway from this activity is that the way we ‘read’ or ‘decode’ networked spaces is presumptuous. Predictable behaviour becomes a target for advertisers through manipulated experiences. We take for granted that we are in safe digital spaces and that the defaults protect us, when, in fact, strategically and deviously designed interfaces capitalize on how we read and engage with online texts.
Finally, knowing that data allows developers to design spaces around our behaviour leaves me curious: how might human behaviour translate to next-generation texts and technologies? How might humans become savvier in navigating digital platforms? And how might we establish safeguards to protect users from designers with ill intent?