I chose to link to Isabella’s post because I admire her decision to stop before reaching the end. I wouldn’t call it quitting, as that carries a negative connotation. Instead, I respect her for recognizing that the task was a waste of her time and consciously choosing to use her time more productively.
Isabella mentions that she felt manipulated into making mistakes, and while this was a harmless exercise, it highlights a larger issue: people are often tricked not just into wasting time and getting frustrated, but also into wasting or losing money.
Reading Isabella’s response, I realized that I shared many of the same thoughts and feelings. I also appreciate how she compared this task to an escape room—because, in many ways, that’s exactly what it felt like. But navigating the web shouldn’t feel like solving a puzzle. This raises an important question: who gets to decide what an intuitive interface looks like, and why is it considered intuitive?
Additionally, after experiencing this task and exploring the Deceptive Patterns website, I wonder—can these tactics be regulated? And if so, who should be responsible for protecting users from them?
References:
Brignull, H., Leiser, M., Santos, C., & Doshi, K. (2023, April 25). Deceptive patterns – user interfaces designed to trick you. Deceptive Design. https://www.deceptive.design/
I felt like this was an exercise in what not to do as a designer if you want to hold people’s attention, as I suspect most users would give up on this task and move on to something else. While completing it, I realized that I likely had an advantage over novice technology users. For example, on the landing page where you need to click ‘here’ to go to the next page, I knew to move the cursor around until it changed, revealing the link’s destination in the bottom left corner of the browser. I couldn’t help but think that if my mother were trying to navigate this site, she would call me, as I doubt she could get the page to advance. After this first page, I quickly realized that this site would not follow the typical “rules” users are accustomed to. For instance, the link wasn’t the expected underlined or blue-colored text but rather the actual word ‘HERE.’
The most frustrating page for me was the second one, where I needed to create a password. The overstimulation from the red and green colors, the constantly changing numbers counting 1, 2, 3, 4, and the confusing wording was frustrating enough. However, what aggravated me was knowing how a page should work but being forced to relearn new (and incorrect) rules. The worst part was figuring out that I had to click on the terms and conditions to accept them in a different window. I suspect this entire exercise gives proficient users a glimpse into the frustration that novice users often experience.
On a broader level, this task made me reflect on how I have been conditioned through repeated exposure to navigate websites and apps intuitively. This conditioning is so ingrained that I felt frustrated when things didn’t work as expected. I suspect this is why people don’t like updates – they often change how basic tasks are accomplished, and many people dislike change. However, I approached this task like a game, and I was determined to “beat” it. I relied on trial and error to work through the system, knowing it was intentionally defying effective user interface principles. This realization helped me persist. I also understood that a well-designed interface is intuitive, aligning with users’ natural expectations, but this one did the opposite, so I knew to experiment with unexpected solutions.
Through this entire process, I felt like I was being manipulated because, if I wanted to reach the end, I needed to play by new rules that I didn’t particularly agree with. In the documentary The Social Dilemma, Tristan Harris states, “If something is a tool, it genuinely is just sitting there, waiting patiently. If something is not a tool, it’s demanding things from you. It’s seducing you. It’s manipulating you. It wants things from you. And we’ve moved away from having a tools-based technology environment to an addiction- and manipulation-based technology environment. That’s what’s changed. Social media isn’t a tool that’s just waiting to be used. It has its own goals, and it has its own means of pursuing them by using your psychology against you” (Orlowski, 2020, 30:11).
While this task was just a simulation, I couldn’t help but notice its manipulative design. For example, why was I required to upload an image? How many people were tricked into downloading an image by the large blue ‘download’ button rather than noticing the faded ‘upload’ text buried in the instructions? This is an example of a deceptive pattern, or what Brignull et al. (2023) call a dark pattern. There were also several points where a ‘skip’ button or alternative options should have existed, such as for the image upload and the ‘pick three interests’ section. Additionally, the forced title selection—where users could only choose between ‘Mr.’ or ‘Mrs.’ and then had to pick the opposite gender to proceed—felt unnecessarily restrictive and frustrating. This felt particularly deceptive.
This experience was a stark reminder of how much we rely on intuitive design and familiar digital patterns to navigate the online world. When those expectations are disrupted, it creates frustration, confusion, and, in some cases, an insurmountable barrier for less tech-savvy users. The deceptive elements, dark patterns, and unnecessary roadblocks highlighted how designers can manipulate user behavior—especially over time—by conditioning us to act in certain ways. This also serves as a reminder that users must continuously strive to improve their tech literacy to avoid falling into the traps of deceptive or dark patterns.
References:
Brignull, H., Leiser, M., Santos, C., & Doshi, K. (2023, April 25). Deceptive patterns – user interfaces designed to trick you. Deceptive Design. https://www.deceptive.design/
Orlowski, J. (Director). (2020). The social dilemma [Film]. Netflix.
To better understand the visualization, I first highlighted the source nodes—our names—which allowed me to distinguish between curators and song titles. I then rearranged the nodes by spreading out all the curators in a U-shape. This made it easier to see which songs were the most popular based on node size and which songs had fewer selections.
By moving a specific track node, I could observe how many and which people had chosen it. For example, Track 13 had five people select it.
When I slightly offset Track 24 next to Track 13, I noticed that four of the five people who selected Track 13 had also chosen Track 24. However, the visualization does not provide any indication as to why these selections were made.
I found it challenging to interpret connections with so many nodes and edges. It would be helpful if the visualization incorporated color-coding. For example, if selecting a source node highlighted that node and all of its edges in one color, with a different color assigned to each source node, patterns might be easier to identify.
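Something like the following minimal sketch illustrates that idea, using Python’s networkx and matplotlib rather than the visualization tool from the task; the curator names, track selections, and color palette are invented placeholders, not the actual course data.

```python
# A rough sketch of the color-coding idea: a curator-track graph in which
# every edge is drawn in the color assigned to the curator it belongs to.
# All names and selections below are hypothetical.
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical data: curator -> list of selected tracks
selections = {
    "Curator A": ["Track 7", "Track 13", "Track 24"],
    "Curator B": ["Track 7", "Track 13"],
    "Curator C": ["Track 13", "Track 24", "Track 3"],
}

G = nx.Graph()
for curator, tracks in selections.items():
    G.add_node(curator, kind="curator")
    for track in tracks:
        G.add_node(track, kind="track")
        G.add_edge(curator, track, curator=curator)

# One color per curator, reused for all of that curator's edges.
palette = ["tab:red", "tab:blue", "tab:green"]
curator_color = {c: palette[i % len(palette)] for i, c in enumerate(selections)}
edge_colors = [curator_color[d["curator"]] for _, _, d in G.edges(data=True)]

# Larger nodes for more frequently chosen tracks, mirroring node size in the task.
node_sizes = [300 + 200 * G.degree(n) if G.nodes[n]["kind"] == "track" else 300
              for n in G.nodes]

pos = nx.spring_layout(G, seed=42)
nx.draw_networkx(G, pos, node_size=node_sizes, node_color="lightgray",
                 edge_color=edge_colors, font_size=8)
plt.axis("off")
plt.show()
```

With each curator’s edges sharing a color, an overlap like the Track 13/Track 24 pattern described above would stand out at a glance instead of requiring nodes to be dragged apart manually.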
In community one, I used a similar method of node arrangement and observed that all five curators in the group had chosen Track 7. Additionally, several other tracks were selected by three or four of the five curators. However, once again, there is no clear way to determine why they made these choices. Is there a common factor among these curators that influences their selections? Another important question the visualization cannot answer is whether all curators chose a particular track for the same reason. In fact, there may be no real similarity between these curators if they all selected the same track but for entirely different reasons.
This dataset contained 27 tracks, and all of them received votes, with the least popular having two votes and the most popular having 15. However, if some tracks had received no votes, they would not appear in the visualization at all. This absence could prevent viewers from considering why those tracks were not chosen.
Ultimately, the visualization provides only basic connections without deeper explanations. With additional data—such as age, gender, location, or musical preferences—it might be possible to identify relationships between song choices and external factors. This highlights why so much data is collected about us. More data enables a deeper understanding of connections and can be used to influence behaviors, such as purchasing habits or even political opinions. If advertisers can determine what persuades one group to click on a link and buy a product, they can apply similar tactics to other groups with shared characteristics.
It is unsettling to consider how our thoughts and decisions are influenced, or how others attempt to influence them, without our awareness, shaped in part by our network connections.
I connected with Quinn’s reflection on Task 5, especially the final paragraph, because it resonated with my own experience as a grade school student reading Choose Your Own Adventure books. Like Quinn, I find comfort in reading a book from beginning to end, knowing the story will unfold in a predetermined way and that I won’t miss anything.
Reflecting on Choose Your Own Adventure books and exploring the Twine projects led me to think of video games. After completing ETEC 544 Digital Game and Learning and exploring the video game The Witcher 3, I continued to play it beyond the course. The Witcher 3 is highly narrative-driven, with a complex storyline in which your decisions as Geralt influence the plot, character interactions, and available choices. This mirrors real-life decision-making, where choices shape our paths in ways we can’t always predict; however, I often wonder whether I am missing something in the game because of a path I have taken.
Reflecting further on hypertext and the web, I recognize how often I feel a similar sense of missing out. The sheer volume of available information can be overwhelming when researching a topic. Unless I focus my attention, I struggle to process the information I gather, constantly wondering what I might be missing. This illustrates how technology has transformed information-seeking behaviour. When I was a middle school student, my research process was straightforward—I consulted the World Book Encyclopedia, followed references to related entries, or sought guidance from the school librarian, who would direct me via the Dewey Decimal System to a small, specific section of books. Today, when my 15-year-old son conducts research, he enters a topic into Google and is confronted with thousands of results. This abundance of links and information presents both opportunities and challenges, making digital literacy an essential skill for today’s students.