Task 11 – Detain/Release

As a true crime documentary enthusiast, I am not surprised that I was heavily engaged in this task. The accompanying resources particularly piqued my interest, and the task itself left me questioning my own thinking, with a heightened sense of both discomfort and awareness. I especially enjoyed the podcasts, which offered insight into real-life examples of how predictive software works.

While completing this task I often felt conflicted about my decisions, because I did not feel I had enough information to make a just one. I also noticed that the prosecution's statement rarely recommended release, and when it did I typically agreed with it. In the majority of scenarios, however, it recommended detention, which left me feeling even more conflicted. The criteria I had to go on were minimal for decisions that carry such heavy consequences. This task opened my eyes to the unjust decisions made every day by the large corporations the public relies on, and to how much we depend on technology and its systems to solve our problems.

It is evident that everyone who uses the internet is affected by algorithms, and once you understand an algorithm, at its simplest, as a set of instructions for solving problems (Malan, 2013), it becomes particularly frightening to think about in the context of legal issues and crime. How crime-prediction software actually operates was in some ways surprising and in others not, and it reminded me of the constant ways people are rated and categorized, including through credit scores, insurance premiums, and the like. For instance, where your vehicle is parked overnight affects your insurance premium. Similarly, this predictive model focuses on geography instead of the individual, and such measurement tools do not capture the 'full picture' of a person (O'Neil, 2017), as the toy sketch below illustrates. In addition, hearing about the quota systems in place, and the strategies used to meet them, was rather disturbing; it reminded me of how my friends and I used to joke about being extra careful on the road toward the end of the month, when police officers needed to make their monthly ticket quotas.
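To make that concrete, here is a deliberately simplified sketch of my own (a hypothetical toy model, not the actual system behind Detain/Release or any real tool) showing how a score built on neighborhood history, rather than individual conduct, can reproduce old patrolling patterns as 'risk':

```python
# Toy geography-based risk score. Every name and number here is
# invented for illustration; no real system works exactly this way.

# Historical arrest rates per neighborhood. These reflect where police
# patrolled most heavily in the past, not underlying rates of offending.
HISTORICAL_ARREST_RATE = {
    "Downtown": 0.30,
    "Northside": 0.05,
}

def risk_level(neighborhood: str, prior_arrests: int) -> str:
    """Label a defendant using area-level history plus a simple count."""
    base = HISTORICAL_ARREST_RATE.get(neighborhood, 0.10)
    score = base + 0.05 * prior_arrests
    if score >= 0.25:
        return "high"
    if score >= 0.10:
        return "medium"
    return "low"

# Two people with identical records receive different labels purely
# because of where they live -- the individual's 'full picture' is
# never consulted.
print(risk_level("Downtown", prior_arrests=0))   # -> high
print(risk_level("Northside", prior_arrests=0))  # -> low
```

The point of the sketch is the feedback loop: because the base rate comes from past enforcement, neighborhoods that were policed more heavily in the past keep generating 'high' labels, regardless of what any individual has done.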

References

O’Neil, C. (2017, April 6). Justice in the age of big data. Retrieved June 18, 2019, from https://ideas.ted.com/justice-in-the-age-of-big-data/

TED-Ed. (2013). What’s an algorithm? – David J. Malan [Video]. YouTube.

Vogt, P. (n.d.-a). The Crime Machine, Part I [Audio podcast episode]. In Reply All. Gimlet Media.

Vogt, P. (n.d.-b). The Crime Machine, Part II [Audio podcast episode]. In Reply All. Gimlet Media.

One comment

  1. Not enough. That’s what went through my head when the slides came up with the detainee’s information. The prosecution always recommended detention, and no explanation was given for how the levels of low, medium, or high were determined. It was hard to make life-affecting decisions with so little information. In McRaney’s (n.d.) podcast he talks about how machines can’t tell whether a bias from a generation ago was morally good or neutral, nor whether it was unjust, based on arbitrary social norms that lead to exclusion. So, is that what happens to these people? Cards were arbitrarily filled out based on previous biases about a group of people. I know it streamlines the process, but does it actually make decisions effectively, to the betterment of the individual and of others who have been affected by the detainee? Mars’s (2018) podcast says that the goal of an algorithm is to replace subjective judgments with objective measurements, to remove the personal and go on facts. But when those ideas of fact are tainted with personal feelings from past practice, they are no longer objective.

    These cards simply did not contain the information one would need to make informed decisions affecting the lives of human beings. The question is: can machines do it? Dr. O’Neil (2016) believes it is possible, and that more and more companies and programs want to correct the mistakes their algorithms have made, which she sees as a huge positive step toward machines making better choices.

    References
    McRaney, D. (n.d.). Machine bias (rebroadcast). In You Are Not So Smart. Retrieved from https://soundcloud.com/youarenotsosmart/140-machine-bias-rebroadcast

    Mars, R. (Host). (2018). The age of the algorithm [Audio podcast episode]. In 99% Invisible. Retrieved from https://99percentinvisible.org/episode/the-age-of-the-algorithm/

    O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy (First edition). New York: Crown.
