For this post I linked to Georgia’s Detain/Release task.

https://blogs.ubc.ca/gkadawong/2022/07/28/task-11-detain-release/

Not enough. That’s what went through my head when the slides came up with the detainee’s information. The prosecution always recommended detention, and no explanation was given for how the levels of low, medium, or high risk were determined. It was hard to make life-affecting decisions with so little information. In McRaney’s (n.d.) podcast, he talks about how machines cannot tell whether a bias from a generation ago was morally good or neutral, nor whether it was unjust, based on arbitrary social norms that led to exclusion. So, is that what happens to these people? Cards arbitrarily filled out based on previous biases about a group of people? I know it streamlines the process, but does it actually make decisions effectively, for the betterment of the individual and of others affected by the detainee? Mars (2018) says that the goal of an algorithm is to replace subjective judgments with objective measurements: to remove the personal and rely on facts. But when those ideas of fact were tainted with personal feelings from past practice, they are no longer objective.

These postcards of information simply did not contain what one would need to make informed decisions affecting human lives. The question is: can machines do it? Dr. O’Neil (2016) believes it is possible, and that more and more companies and programs want to correct the mistakes their algorithms have made, which she sees as a hugely positive step toward machines making better choices.

References
McRaney, D. (Host). (n.d.). Machine bias (rebroadcast). In You Are Not So Smart. Retrieved from https://soundcloud.com/youarenotsosmart/140-machine-bias-rebroadcast

Mars, R. (Host). (2018). The age of the algorithm. In 99 Percent Invisible. Retrieved from https://99percentinvisible.org/episode/the-age-of-the-algorithm/

O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy (1st ed.). New York, NY: Crown.