Task 11: Detain/Release


I was nervous to do the Detain/Release simulation because I knew it would stir up some feelings in me. I also knew I would have a hard time deciding which people to detain and which to release. Personally, I know I lean more toward the release side, especially when reading about cases in the news or elsewhere.

Some of the personal biases I have:

  • I am a woman of colour
  • I have close family members who have been in jail
  • I have read a lot of crime novels
  • I have watched a lot of crime television shows (true crime and fictional)
  • I stay informed with the news

These are just some of the criteria I used when deciding whether or not I would release an individual:

  • The crime they were charged with
  • Their explanation or reasoning as to why they should be released
  • Their assessed risk of violence (low, medium, or high)

When making these decisions, I thought first about the crime the person was charged with. For me, drug possession ranks pretty low on my list of crimes, so I would often release the individual unless they had a medium or high risk of violence, in which case I would question that decision. I also looked at the individuals’ explanations. Some individuals charged with lower-level offences explained that they would be unable to feed their families if they were detained. This also affected my decision.

In her article on algorithms used by police forces to detect and predict areas of crime, O’Neil (2017) poses the question of “whether we as a society are willing to sacrifice a bit of efficiency in the interest of fairness.” The fact that this is even a question frustrates me. Fairness should never be traded off for efficiency, especially in legal matters that carry significant weight in people’s lives. Algorithms like “PredPol, even with the best of intentions, empowers police departments to zero in on the poor, stopping more of them, arresting a portion of those, and sending a subgroup to prison” (O’Neil, 2017).

In the Detain/Release simulation, many of the individuals arrested for crimes like drug possession were poor and expressed that they would not be able to feed their families if they were detained. Is this type of software only working to target the poor who are involved in drug possession, while leaving out the wealthy who are just as involved in the same kinds of crimes? The simulation also used blurred-out cartoon faces to represent those charged with crimes, but race and gender were clearly visible, and I wonder whether those factors would also affect someone’s decision-making process.

References

O’Neil, C. (2017, April 6). Justice in the age of big data. ideas.ted.com. https://ideas.ted.com/justice-in-the-age-of-big-data/

Porcaro, K. (2019, April 17). Detain/Release: Simulating algorithmic risk assessments at pretrial. Medium. https://medium.com/berkman-klein-center/detain-release-simulating-algorithmic-risk-assessments-at-pretrial-375270657819
