Linking Assignment: T11


I responded to Anu’s post because of her comment that algorithms need to be transparent and fair. What does fair mean exactly? Does it mean that everyone should be treated the same? If someone doesn’t show up for their hearing because they need to work to support their family, should they be treated the same as someone who doesn’t show up because they were stealing to support their family? I wonder how an algorithm would process this. O’Neil (2017) mentions that the algorithm targets people in poverty. It’s impossible for an algorithm to consider all aspects of a person’s circumstances because, at this time, not all agencies are linked or even online (for example, hospital records). When an algorithm targets people based on their income or skin colour, it overlooks the underlying issues and perpetuates the biases it was created with. I think algorithms can be seen as patterns that seek to repeat themselves, which is why the same people keep getting pulled into this net. Judges in bail court have limited resources and time; if this is all the information you have, is it fair to ignore it? Could algorithms be used for preventative measures instead, so judges can properly listen to cases?

References

O’Neil, C. (2017, April 6). Justice in the age of big data. IDEAS.TED.COM. https://ideas.ted.com/justice-in-the-age-of-big-data/

Linking Assignment: T10

I responded to Clarissa’s post because some of the things she did, such as entering fake personal data, have become part of our daily lives. In the past, child predators and kidnappers would want to know your name so they could pretend to know who you are and get you into their van or house. Now marketers and companies use your name to create a false sense of intimacy so they can get you into their stores, website and/or checkout counter. The occult could use your birthday and birth time to see your past and future, which is basically what the algorithms do, but the algorithms seem more insidious because they also use that data to manipulate your future decisions. I suppose the occult could do that as well, but the algorithms are everywhere, hard to ignore, and so commonplace that they sometimes go unnoticed.

Task 12: Speculative Futures

“Jay, why did you scream?”

“I don’t know.”

“Think about it.”

“I thought that was your job.”

“What was happening before you screamed?”

“Well, B screamed first so I decided to too.”

“If B jumped off a bridge, would you do that too?”

“You sound like my teacher. This is really annoying; why do you have to go over everything I do?”

“Because you don’t.”

Dunne and Raby (2013) state that conceptual designs are more than just ideas; they’re ideals as well. Moral philosopher Susan Neiman elaborates on this: “Ideals are not measured by whether they conform to reality; reality is judged by whether it lives up to ideals” (cited in Dunne & Raby, 2013, p. 12). Jay is an only child whose busy, career-oriented parents have become concerned about Jay’s emotional maturity. They have invested in a SmartWatch app that uses sensors to detect Jay’s emotional state and behaviour and to prompt Jay to reflect on and regulate their emotions and actions. Dunne and Raby (2013) note that designers’ creations are made with the best intentions, often neglecting people’s worst tendencies. Their solution to this design flaw is to change “our values, beliefs, attitudes, and behavior” (Dunne & Raby, 2013, p. 2). The algorithms the app uses to determine behaviours and emotions are user-dependent, meaning Jay will be prompted by the app to input their emotions and their causes.

Meanwhile, Jay’s teacher is talking to herself, or is she? She’s having a discussion with her teaching app. The app goes through the curriculum and student data. From the time students enter the school, their learning progress has been input into a database. This data, plus lesson plans taken from several online resources, is input into the app, which then uses this information to plan lessons around the teacher’s learning objectives. Initially, teachers were concerned that their jobs and/or pay would be cut, but as Dr. Shannon Vallor (2018) observed, “AI is not ready for solo flight,” so “we [people] are still the responsible agents.” The purpose of this app is to free up time for teachers to have non-contact hours during the school day for evaluating student work, focusing on presentation rather than creation, mentoring and coaching new teachers, and of course, inputting students’ learning progress into the database. Teachers are still involved and engaged in the lesson planning process since the lessons will use the data teachers input into the database to determine whether the class should move on to the next learning objectives. Different activities and lesson formats are suggested by the app for teachers to choose from, with the option to make modifications. The role of the teacher is moving from lesson planner to evaluator and mentor. Dr. Vallor (2018) points out that the “future of human-AI partnership, one that serves and enriches human lives, won’t happen organically; it will need to be a choice we make, to improve our machines by improving ourselves.”

References

Dunne, A., & Raby, F. (2013). Speculative everything: Design, fiction, and social dreaming. The MIT Press. Retrieved August 30, 2019, from Project MUSE database.

Vallor, S. (2018, Nov 6). Lessons from the AI mirror Shannon Vallor [Video]. YouTube. https://www.youtube.com/watch?v=40UbpSoYN4k&t=872s


Task 11: Detain/Release

Everyone comes with their own biases, and algorithms may be formed with the best intentions, but they are undoubtedly embedded with the biases of their creators. The biases I had coming into the Detain/Release module were:

  1. I’ve watched every season of The Good Wife. I think in the last season the main character worked in bail court.
  2. I’ve watched most, if not all, of John Oliver’s segments on prisons and detainment.
  3. The song “What’s Your Story?” came to mind while I was listening to the different podcasts.

I don’t think this background makes me an expert, but this viewing history has made me more aware of biases and of an unjust, prejudiced judicial and policing system where evidence is sometimes “lost” or “found” to get a case off the docket.

The charge is serious and sensitive. The information from the algorithm is confusing and makes me think of To Kill a Mockingbird where Tom Robinson was wrongly accused of rape. At the same time, there are plenty of reports in the news where rape victims’ claims were dismissed for no reason. I’d want to see the evidence they have against Alexander Dix. Why is he likely to appear and unlikely to be violent, but likely to commit a crime? Wouldn’t a person likely to appear also be unlikely to commit a crime? Did the algorithm predict he’d commit a crime because of skin colour? What kind of crime does the algorithm predict he’d commit? Shoplifting? Shoplifting is not something I condone, but why would he shoplift? It sounds like he has a family to support. If he is no longer there to support his family, isn’t it more likely that the people he left behind will suffer and perhaps resort to petty crime?

I think the problem with the algorithm is that it overlooks underlying issues that need to be solved, such as over-policing in certain areas and the lack of social support systems like welfare and childcare, which O’Neil (2017) noted are related and result in the poor being caught up in “digital dragnets”.

It’s not just adults who can become entangled by these algorithms. Learning management systems (LMS), which are becoming a part of more educational systems, can collect data from students that can be used to identify at-risk students. As a teacher, I can use the data to confirm what I already know or look into cases that surprise me; perhaps a student who did well in class did poorly on a test because of a stomachache or an argument. However, if this data were sent to the head of school or the school board, it would not show the wider picture that the classroom teacher is privy to. A report sent without teacher input could unnecessarily alarm and upset parents and students.

Algorithms do have their time and place. They definitely would have made my life easier back when I had 200 students during online classes, but I would have had to follow up with the students’ homeroom teachers to use that data effectively to evaluate and support students.

References

O’Neil, C. (2017, April 6). Justice in the age of big data. IDEAS.TED.COM. https://ideas.ted.com/justice-in-the-age-of-big-data/

Linking Assignment: Task 9

Grant’s comment to my T9 post made me dive more deeply into the question I asked at the end of Task 9: Perhaps it would be easier to achieve diversity through universal feelings?

His comment made me curious to see his T9 post, which I responded to and posted below.


I think we should lean into our differences and try to understand them rather than ignore them.

In more detail, I’d like to talk about my shopping experiences in China. Sometimes the item I want is available but not in the colour I want. I used to say, “No” when asked if I’d like to purchase it in a different colour. Now I say, “That colour is not quite to my liking” because I noticed that the sales associate would giggle. At first I was surprised by the laughter; to me it made no sense because I had not said anything funny, and I did not think my pronunciation was bad enough to cause laughter. Then I remembered a tip I had been given during my early days of living in China: people often laugh when they are nervous. Remembering this made me realize that my answer was too abrupt. I started paying attention to how locals interacted with sales associates and learnt how to say “no” in an inoffensive way. The problem with algorithms is that people often don’t know much about the brains and biases behind them. What biases do their creators have? What data did they input, and how did they interpret this input? How I interpret the data an algorithm comes up with will be different from how someone else would.

If my cultural blunders could be visualized as an image of me covered in question marks, would the number of question marks lessen as I interact with more locals and increase in number every time I move to a new region? Algorithms can both dictate and influence people’s decisions, but isn’t that the same as human interactions? The more often I interact with a group of people, the more I understand them and the more I might adjust my behaviour to better interact with them, but unlike an algorithm, I can find out the exact reasons behind their behaviours.

Dr. Zeynep Tufekci’s (2017) TED Talk discusses how the choices algorithms make can affect our emotions and political beliefs. I wonder if algorithms could be designed so people can explore differences, like the op-ed section of a newspaper. At the same time, I am worried about what I consider to be the darker side of our differences. Do I want to know why some people think the Holocaust is a hoax and why some people are, for example, anti-Semitic? I think it would be helpful to know their reasons so I can better address them, but I don’t want to be exposed to the vitriol that probably exists behind those reasons. If I don’t see that vitriol, could I be misled and misunderstand the impact of those feelings? If I were to design an algorithm that aims to provide a balanced argument, would I overcompensate for my biases and lean too far to the other side? Can technology distinguish between equality and equity? Perhaps it can sometimes, but not all the time. Technology should be something humans interact with, not something that replaces humans.

References

Tufekci, Z. (2017). We’re building a dystopia just to make people click on ads [Video]. TED. https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click_on_ads

Task 10: Attention Economy


Infuriating was my initial thought; I think my reaction was so extreme because of some things that had happened earlier in the day. After completing the “game” I visited the bagaar.be website, read through their job application page, and realized what a great ice-breaker this would be at their job interviews. Now I think the people behind this idea are creative and could be fun in small doses.

To start the game, I had to read the text under the giant NO button. While HERE was capitalized, click was underlined, making it look like a link. I clicked on click first because I wanted to check whether it was a link. It wasn’t, so I had to click on HERE. Websites will often make the surrounding words part of the link as well, both to make it easier for users and to catch any missed clicks. This process to start the activity reminded me of the attention and time needed to subscribe to and unsubscribe from paid services.

A pop-up reminding me that time was passing took several tries to close. The option to close the pop-up was written in a small font in a light colour, and it could easily be missed when scanning because it looks like a copyright line. I thought this pop-up design effectively exploited Steve Krug’s psychological insight shared in Brignull’s (2011) article: “[w]e don’t read pages. We scan them”. The time pressure can make people skip or rush through the fine print and click on buttons without knowing what they’re agreeing to.

Another psychological insight Brignull (2011) mentions comes from Robert Cialdini: “[p]eople will do things that they see other people are doing”. Normally I would not bother completing a “game” like this. Even though the assignment says that the “game” does not have to be completed if an explanation is given, I chose to complete the activity. I could have written about my own experiences with dark patterns, but I thought that most of my classmates would complete the “game”.

Before completing this task, and after reading about forced continuity practices (Brignull, 2011), I remembered a subscription I needed to cancel. To get my cancellation processed, I was asked to confirm it. The Yes option to cancel was in red on the left, and the No option to stop the cancellation was in green on the right. However, I’m not sure I should describe this company as deceptive, because they did send me a reminder email a few days before my free trial ended. They could have set up the options that way to avoid accidental cancellations, but then again, resubscribing should be easy.

Prior to reading Brignull’s (2011) article, I saw dark pattern practices as part of everyday life. I think most organizations try to mislead customers, just as stores hike up prices before sales or make lower-quality clothes for their sales and outlet stores. Just because these practices are common doesn’t mean they’re acceptable.

Dark patterns can go beyond generating revenue; they can be used to drown out honest voices. I was looking for free AR apps to use in an educational setting, and it seemed that the completely free AR apps recommended by educational blogs and educators online had gone out of business. Having never used them, I can only speculate about what happened. Perhaps they used honest design in their UI, which resulted in lower revenue and less advertising. If honest UI design leads to not having a voice, should honest companies resort to dark patterns to balance the field? Brignull’s (2011) Expedia.com example shows that companies can drop dark patterns, but would they have reached the success they experienced if they had started with honest practices?

References

Brignull, H. (2011). Dark patterns: Deception vs. honesty in UI design. A List Apart, (338).

Task 9: Network Assignment

For this assignment, I preferred using the tables generated from the Palladio file. First, I found the community I had been sorted into in the Palladio file. Grant, Emily, Elizabeth and I shared four common tracks, which I have highlighted in yellow.

I made this table to help me better visualize and make sense of our choices. Both the least and most selected tracks were picked by everyone within my community: The Well-Tempered Clavier and Flowing Streams.

Due to the popularity of Flowing Streams, I wondered if diversity and representation were common themes among everyone’s lists. I read the Task 8 assignments from the people in my community and some random ones from neighbouring communities, and it seems that many people strived for representation of diverse cultures and identities.

During Task 8, I tried to choose tracks with a diversity of cultures and identities in mind, but I was concerned about how I went about this because:

  1. Diverse representation could imply to extra-terrestrials that diversity is widely valued and exists peacefully.
  2. In reality, we live in a world where biases are perpetuated by algorithms, which allows individuals to live with little exposure to different perspectives (Edwards, 2019).
  3. Choosing tracks from different cultures felt like tokenism since it does not address the underlying issues that exist.
  4. Representation by population ignores the minority groups that do not have the population numbers or cultural clout to be represented.
  5. The data input into algorithms are used to identify current and future trends. How can we be sure that the algorithm will interpret this data so it values diversity?
  6. The impossibility of representing all cultures through 10 tracks makes me wonder if a better way to represent all humans is being overlooked by the majority, and thus ignored by the algorithms. The most advanced algorithms use neural networks that are designed to copy the way the human mind works and to identify underlying relationships within a data set (Edwards, 2019), which means diversity of thought will be limited by the data set.

I looked at the Task 8 posts completed by people in the community farthest from mine. One thing I noticed was that the people in this group all mentioned auditory aesthetics, which, I feel, were not as prominent in the Task 8 assignments from people in my community. Next, I visited the Task 8 assignments of two people who had not chosen Flowing Streams: Johanna and Danya. Johanna listened to the tracks and used universal human experiences, feelings, to curate her tracks. Danya’s process involved using research and sound to pick tracks that represent the world through history, geographical representation, and a variety of musical instruments, rhythms and melodies, to narrow down her initial choices. Like Johanna, she selected her final ten through listening. Their approaches were completely different from mine, but nevertheless effective. Because this data was not in my immediate community, I doubt I would have seen it in any of my searches or social media feeds. If I were to complete Task 8 again, I would seriously consider approaching the task through human emotions, because feelings are something everyone experiences and can relate to. Perhaps it would be easier to achieve diversity through universal feelings?

References

Edwards, J. (2019, August 19). What is predictive analytics? Transforming data into future insights. CIO. https://www.cio.com/article/3273114/what-is-predictive-analytics-transforming-data-into-future-insights.html

Linking Assignment: T8 – Spinning Records

I was impressed with Nick’s list because he broke it down by region, and it was easy to see that his goal was to represent as many different musical traditions and cultures as possible. At the same time, I feel curation comes from a place of expertise. I’m not a music expert, but I am an expert on my own experiences. If extra-terrestrials have a genuine interest in understanding humans, they would look through the top 10 lists of not just one person but many, and compare them. The enormity of the digital information available and the impossibility of archiving it all mean there is more information out there than one human life can consume. To make sure these archives are not lost, information should be presented in a way that creates curiosity and encourages further reading, like the opinion pages of a newspaper.

Right now the job of archiving online data and information resides in the hands of a few. What criteria do they use to determine what is valuable enough to archive? Could technology and algorithms help archivists archive more online texts?

Below I’ve posted my comment on Nick’s T8 task.

Hi Nick,

I took a different approach from you. While originally I tried to be objective, I ended up choosing tracks based on my own prior background. For example, originally I wanted to talk about how the opening notes of Beethoven’s Fifth Symphony are described as fate knocking, but a quick Wikipedia search showed me that this might not have been Beethoven’s intention at all, and that the motif was more likely mimicking a yellowhammer’s song. I was torn between which description to choose: the one most popularly believed or the one most likely to be true. Sometimes I find it hard not to be subjective. I think if I were to redo this activity, I would make two Top 10 lists, one based on my own preferences and one where I do my best to be objective, to see how much or how little overlap there is.

Task 8: Golden Record Curation

When I completed the quiz, I thought I’d chosen the tracks in the mind map, so imagine my surprise when I checked the Palladio file and saw four tracks I didn’t remember choosing: Izlel je Delyo Hagdutin, Melancholy Blues, Kinds of Flowers and Tchakrulo. The only explanation I can think of is that sometimes my connection freezes, and when it does, the page loads incorrectly and becomes partially misaligned. When I complete Assignment 9, I will use the data that was input, but for this activity, I’m posting what I thought I put into the quiz.

This activity was interesting because this course has primarily dealt with the visual aspect of communicating. Because I think this world is becoming more visual, I couldn’t help but wonder how many people would listen to each of these tracks and, if they did, how much of each they would listen to. Another thought that came to mind: what if species from another planet were browsing Earth’s Internet to learn more about us, but did not have sensors to translate these vibrations and sounds into something meaningful? As Dr. Rumsey (2017) noted, written notes need to be included in archives; they can’t be left to be misread. Even without considering extra-terrestrial beings, the digitization of records does not result in absolute accessibility. Just as the proprietary nature of hardware and software development can lead to technology becoming obsolete (Smith, 1999), perhaps this record will go the way of GeoCities. However, as Dr. Rumsey (2017) said, the impossibility of preserving everything does not mean we should not even try. I initially set out to choose tracks that I felt were inclusive and representative of the world’s diversity, but at the end of the activity I found the pieces I chose were personal. While I was researching the different pieces, my prior knowledge and memories influenced my choices. Dr. Rumsey (2017) said that in scientific thought, the past is a tool for predicting the future. Knowledge handed down from teacher to students, and then to the students’ students, is the future being created; so while my choices are personal, they are formed from the knowledge handed down by my own teachers and the teachers who taught them.

*To enlarge the mind map: 1. right-click on the image 2. open the image in another tab 3. left-click on the image to zoom in

References

Asia population. (2021, October 27). Worldometer. https://www.worldometers.info/world-population/asia-population/

Brown University. (2017, July 11). Abby Smith Rumsey: “Digital Memory: What Can We Afford to Lose?” [Video]. YouTube. https://www.youtube.com/watch?v=FBrahqg9ZMc

Smith, A. (1999). Why digitize? Retrieved June 15, 2019, from Council on Library and Information Resources website: https://www.clir.org/pubs/reports/pub80-smith/pub80-2/

The rite of spring. (2021, October 19). In Wikipedia. https://en.wikipedia.org/wiki/The_Rite_of_Spring#cite_ref-2

Waxman, O.B. (2018, January 25). How ‘here comes the bride’ became the song you hear at every wedding ceremony. Time. https://time.com/5115834/wedding-march-here-comes-the-bride/

Task 7: Mode-bending

Albertine Gaur (1992) suggested that the storage, preservation and distribution of knowledge no longer rely on the “actual process of writing”, so the digital era can be said to recall the pre-literate era (cited in Dobson & Willinsky, 2021, p. 5). While I made Task 1 with audio, I do not consider it to be accessible the way the oral tradition is, because the audio is monotonous and long-winded. My goal was to make this task accessible in a way similar to oral stories. I chose to make a digital picture book because I felt this would be a familiar mode for most people. Unlike Task 1, Task 7 is hierarchical because it must be accessed in a certain order to be understood. While I like how Task 1 could be accessed in any order, Mohageg (1992) found that “highly networked nonhierarchical environments challenged participants and produced a negative effect on task performance” (cited in Dobson & Willinsky, 2021, p. 7). I hope that choosing what I consider a common mode of literacy will lessen the chance of viewers getting “lost” because the connections I made are more relevant to me than to others (Dobson & Willinsky, 2021, p. 7). Another concern I have with relevancy is whether this will make sense in the future, so I added cards linking to relevant videos to my video. Unfortunately, only five cards can be inserted into a video, though I did manage to add two extra video links at the end.

In one of my classes, the recommended video length is 1.5 to 5 minutes, which luckily fit within the limits of the free screen-recording app I used. At first, I wanted to put as much of the information from Task 1 as possible into Task 7, but the information felt stale and mostly irrelevant. The New London Group (1996) notes that numerous subcultures and the growing disparity of thought among them have led to an “invasion of private spaces by mass media culture” and to the difficulties families and teachers face competing with these messages (p. 70). As a teacher, I would want to know what my students think, and as a creator, I am more interested in hearing what questions and connections my viewers make than in talking about myself. Perhaps if people listened more and talked less, a greater understanding and acceptance of ideas could be reached.

References

Dobson, T., & Willinsky, J. (2021). “Digital literacy” in The Cambridge handbook of literacy. Unpublished manuscript.

The New London Group. (1996). A pedagogy of multiliteracies: Designing social futures. Harvard Educational Review, 66(1), 60-92.
