IP 3 Algorithms

“At a time when state funding for public goods such as universities, schools, libraries, archives, and other important memory institutions is in decline in the US, private corporations are providing products, services and financing on their behalf. With these trade-offs comes an exercising of greater control over the information, which is deeply consequential for those already systematically oppressed…” (Noble, p. 123) 

 Explain in your own words what “content prioritization” (Noble, p. 156) means (give some examples) and how (in lay terms) content prioritization algorithms work.  

Content prioritization means giving preference to certain types of digital content, highlighting them or showing them before others. Prioritization algorithms follow rules and organized patterns that determine what you see. These patterns are based on relevance to you, your previous searches and online behaviour, and what is currently popular. For example, if you often search for articles about sports, open news stories about sports, and like posts about sports, the algorithm will learn this behaviour and prioritize sports-related content for you. Another example is TikTok: depending on how long you spend watching a video, TikTok gauges your interest level in the post and continues to prioritize similar posts. Other examples can be seen in your social media feeds and news apps; even my Gmail prioritizes certain mail and files what it considers lowest priority into my junk folder. Many of these algorithms can seem positive because they increase efficiency and steer you towards areas of interest; however, they also result in people seeing more of the same, creating ‘filter bubbles’. Merriam-Webster defines a filter bubble as “an online environment in which people are exposed only to opinions and information that conform to their existing beliefs” (Definition of FILTER BUBBLE, 2023). This has many consequences for the user and for others in society, particularly those who are often marginalized.
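The signals described above can be sketched as a simple scoring function. This is a hypothetical, minimal illustration, not any platform's actual algorithm: the field names, weights, and data are all invented. It shows how weighting interest match heavily causes older, less popular posts on a favourite topic to outrank fresher content on other topics, which is exactly how a filter bubble forms.

```python
# Toy content-prioritization score, assuming a feed that blends three
# signals: match with the user's interests, recency, and popularity.
# All names, weights, and numbers here are hypothetical.

def priority_score(post, user_interests,
                   w_interest=0.5, w_recency=0.3, w_popularity=0.2):
    """Return a score in [0, 1]; higher-scoring posts are shown first."""
    interest = 1.0 if post["topic"] in user_interests else 0.0
    recency = 1.0 / (1.0 + post["age_hours"])      # newer -> closer to 1
    popularity = min(post["likes"] / 1000.0, 1.0)  # capped at 1
    return (w_interest * interest
            + w_recency * recency
            + w_popularity * popularity)

posts = [
    {"topic": "sports",  "age_hours": 2,  "likes": 150},
    {"topic": "finance", "age_hours": 1,  "likes": 900},
    {"topic": "sports",  "age_hours": 24, "likes": 50},
]

# A user who mostly engages with sports sees both sports posts first,
# even though the finance post is newer and far more popular.
feed = sorted(posts, key=lambda p: priority_score(p, {"sports"}),
              reverse=True)
```

Because interest match carries the largest weight, the ranking keeps feeding the user more of what they already engage with; the popular finance post sinks to the bottom of this user's feed.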

With control over the “largest digital repository in the world” (Noble, p. 157), how have Google’s content prioritization algorithms been “consequential for those already systematically oppressed”? How do they impact your professional life? (give specific examples and briefly discuss) 

Noble highlights Google as an example of a private institution that has gained massive control as state-funded memory institutions decline. If a corporation such as Google prioritizes specific content, it goes against the principles of net neutrality. In a net-neutral environment, content should not be chosen based on what it is or who is viewing it. Without this neutrality, there is no longer a level playing field for all content and viewers; instead, there is the opportunity for discrimination and bias.

For example, “Google focused on its prioritization of high-paying advertisers that were competing against small businesses and entities that do not index pages on the basis of the pay-per-click advertising model” (Noble, 2018, p. 158). Small companies that already face obstacles are therefore further disadvantaged. Prioritization algorithms are also based on previous data, so if historical data includes stereotypes or bias towards certain people, these can be perpetuated. For example, if you searched ‘doctor’ or ‘engineer’ and saw only pictures of men, you might be influenced to believe that these are male-dominated professions. If you were seeking to learn about a certain culture, your online searches might yield only stereotypical pictures and information, furthering an inaccurate representation.
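The advertiser-prioritization problem Noble describes can be illustrated with a small sketch. This is an invented example, not Google's actual ranking formula: the sites, relevance scores, bids, and weights are all hypothetical. It shows how blending even a modest advertising signal into a ranking lets a high-bidding large retailer outrank a small business that is actually more relevant to the query.

```python
# Hypothetical sketch of how mixing pay-per-click bids into a search
# ranking can favour high-paying advertisers over more relevant small
# businesses. All names, weights, and data below are invented.

def ranked(results, w_relevance=0.6, w_bid=0.4, max_bid=10.0):
    """Sort results by a blend of query relevance and advertising bid."""
    def score(r):
        return w_relevance * r["relevance"] + w_bid * r["bid"] / max_bid
    return sorted(results, key=score, reverse=True)

results = [
    # A small business with a highly relevant page but a tiny ad budget.
    {"site": "small-local-shop.example", "relevance": 0.9, "bid": 0.5},
    # A large retailer with a less relevant page but a large bid.
    {"site": "big-retailer.example",     "relevance": 0.6, "bid": 9.0},
]

ordering = [r["site"] for r in ranked(results)]
# The big retailer lands on top despite being less relevant.
```

Even though relevance is weighted more heavily (0.6 vs. 0.4), the retailer's bid advantage is large enough to flip the order, which mirrors the dynamic Noble criticizes.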

These algorithms impact my professional life as a teacher in many ways. They can make it harder to find resources that are diverse. This is especially difficult as a Social Studies teacher trying to stay up to date with current events: algorithms make it much harder to ensure I am getting an unbiased understanding of what is happening across the world. These algorithms also shape curriculum and lesson planning. Administration and I must build this learning into the curriculum so that students understand that what they see online is not the whole picture, teaching skills such as triangulation and fact-checking to combat their ‘filter bubbles’ and misinformation.

PageRank is essentially a popularity contest for websites. It looks at how many other websites link to a particular site and treats those links as votes. The more votes a site gets, and the higher the quality of those votes, the more important Google considers it, and the higher it appears in search results. This impacts my personal life in many ways. I rely on research and information to form many decisions in my life, for example, my finances. I search for information online about investing, saving, mortgages, and so on, and the ranking may surface certain content because large corporations have the resources to attract links and paid placement, making them seem the most reputable. This kind of ranking system is also very visible in social media, something I use daily. If a post on Instagram or TikTok gets a high number of likes, it gains higher visibility on my feed. However, just because something is popular does not make it accurate or credible. Certain posts act as clickbait and contain misleading information. Popular and wealthy people such as celebrities and content creators continue to get more popular, rather than equal opportunity going to those with smaller platforms but valuable things and ideas to share. Therefore, I can influence these rankings by being conscious about what I click on and share, and by ‘liking’ diverse posts.
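The "votes from important pages count for more" idea above can be sketched with a minimal power-iteration version of PageRank. This is an illustrative implementation of the classic published algorithm, not Google's production system; the three-page link graph and the standard damping factor of 0.85 are chosen for demonstration.

```python
# Minimal PageRank sketch via power iteration: each page's rank is
# redistributed along its outgoing links, so a link ("vote") from a
# high-ranked page is worth more than one from a low-ranked page.
# The link graph below is illustrative.

def pagerank(links, damping=0.85, iters=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank
    for _ in range(iters):
        new = {}
        for p in pages:
            # Sum the rank flowing in from every page q that links to p.
            votes = sum(rank[q] / len(links[q])
                        for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * votes
        rank = new
    return rank

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
rank = pagerank(links)
# C receives links from both A and B, so it ends up with the highest rank.
```

Note the feedback loop the prose describes: C ranks highest because two pages vote for it, and A ranks above B partly because its single vote comes from the highly ranked C. Popularity compounds, which is exactly why already-prominent sites and creators keep gaining visibility.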

Definition of FILTER BUBBLE. (2023, December 7). Merriam-Webster. https://www.merriam-webster.com/dictionary/filter+bubble

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press. https://doi.org/10.18574/9781479833641 
