Task #11: Detain and Release Algorithms

Module 11 focused on algorithms: a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer (“Algorithm,” 2022). This module examined the influence of human biases in the creation of algorithms, particularly in policing.

The task we were challenged with this week was to complete a simulation called Detain and Release. In this simulation, the user acts as a judge using a risk assessment tool to determine whether to detain or release each defendant until trial. The risk assessment tool in this simulation was built on data from the U.S. Census and the Bureau of Justice Statistics to generate defendants and alleged offences (Porcaro, 2019). For each defendant, the tool reported the probability that they would fail to appear at trial, the likelihood that they would commit another crime, and their propensity for violence. The simulation limited the jail space available to the user and provided feedback on how the fictional community was feeling about their decisions.
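
To make the mechanics concrete, the sketch below is a minimal, hypothetical Python version of a tool like this: it randomly generates a defendant with three risk ratings and applies the rough rule of thumb I describe later (detain anyone rated high for violence). The names, risk levels, and offence list are my own illustration, not Porcaro's actual implementation or data.

```python
import random
from dataclasses import dataclass

# Hypothetical risk ratings and offences, standing in for the simulation's
# risk scores (illustration only; not Porcaro's actual code or data).
LEVELS = ["low", "medium", "high"]
OFFENCES = ["theft", "assault", "fraud", "drug possession"]


@dataclass
class Defendant:
    alleged_offence: str
    fail_to_appear: str  # risk of failing to appear at trial
    reoffend: str        # risk of committing another crime
    violence: str        # propensity for violence


def generate_defendant() -> Defendant:
    """Randomly generate a defendant, standing in for the census/BJS-derived data."""
    return Defendant(
        alleged_offence=random.choice(OFFENCES),
        fail_to_appear=random.choice(LEVELS),
        reoffend=random.choice(LEVELS),
        violence=random.choice(LEVELS),
    )


def naive_decision(d: Defendant) -> str:
    """A naive rule of thumb: detain anyone rated high for violence, release the rest."""
    return "detain" if d.violence == "high" else "release"


if __name__ == "__main__":
    for _ in range(5):
        d = generate_defendant()
        print(d, "->", naive_decision(d))
```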

Porcaro (2019) states: “Software has framing power: the mere presence of a risk assessment tool can reframe a judge’s decision-making process and induce new biases, regardless of the tool’s quality” (p. 1). I would agree with this statement; I did not question the risk assessment tool’s quality. I made decisions based on my own personal biases and, in fact, felt more confident and comfortable with my decisions as the simulation progressed.

As I reflect on this task, I recognize a few biases and thought processes that were present as I made each decision.

First, violence was a major factor. If a defendant’s probability of violence was reported as high, I immediately detained them. For the one defendant charged with rape, I clicked detain the fastest. Does this have to do with my gender and/or size? Would a large male have made the same decisions in these situations?

Second, as a person of color, I am very aware of the North American statistics on the disproportionate number of BIPOC people in jail. In Canada in 2018, 28 percent of incarcerated people were Indigenous, even though Indigenous people represent only 4.1 percent of the overall Canadian population (Department of Justice Canada, 2018). The true crime podcast really highlighted this fact, with stories of targeted summonses, particularly for young Black men (Vogt, n.d.). Consciously, I tried not to look at the skin color in the blurred image, but when I did, I believe I overcompensated for systemic racism and was less likely to detain a BIPOC defendant.

Finally, I was not very concerned about a defendant’s flight risk (unless they were violent). I was more concerned about defendants not being able to keep their jobs, families, or homes. This was evident when I finished the simulation with a high level of public fear but relatively few people in jail.

I am incredibly interested in reviewing our class results from this simulation. It was shockingly easy to make decisions based on randomly generated data in a risk assessment tool. Until we see how our decisions affect real people and carry real consequences, it is hard to humanize the process. This task allowed me to examine my own biases within the justice system, but more importantly, it reminded me that algorithms are not neutral: we have to question their biases and remember that algorithms have real-life consequences.

  

References: 

Algorithm. (2022, July 17). In Wikipedia. https://en.wikipedia.org/wiki/Algorithm

Department of Justice Canada. (2018). Indigenous overrepresentation in provincial/territorial corrections. Just Facts, Research and Statistics Division. https://www.justice.gc.ca/eng/rp-pr/jr/jf-pf/2018/docs/nov01.pdf

Porcaro, K. (2019, January 8). Detain/Release: Simulating algorithmic risk assessments at pretrial. Medium. https://medium.com/berkman-klein-center/detain-release-simulating-algorithmic-risk-assessments-at-pretrial-375270657819

Vogt, P. (n.d.). The Crime Machine, Part II [Audio podcast episode]. In Reply All. Gimlet Media.

 
