The main takeaway for me from this week’s readings and assignment was how “AI-informed” decisions are not as objective, fair, and mathematically sound as they seem, and how dangerous it can be to rely on them for important decisions (policing, parole, loan approvals, hiring, admissions) without understanding how these machines were trained in the first place. Not only are these mathematical models usually hidden and inaccessible, they are also programmed in a way that most people cannot understand. As Dr. O’Neil mentioned in her Google Talk, there is bias in the data you train your algorithm on, and your definition of success imposes its own agenda on the algorithm. Trained this way, and under the guise of mathematics and technology, the algorithm regurgitates the bias inherent in its training data, deepening inequity and widening societal gaps.

This task reminded me of a game called ‘Survival of the Best Fit’ (https://www.survivalofthebestfit.com), which addresses machine bias in hiring practices. You would think a machine would be less discriminatory than a human interviewer, who comes with their own biases and preferences shaped by experience. Aren’t these automated decision-making systems supposed to make decisions objectively, based on facts alone, without discriminating by race, gender, age, etc.? Wrong. As predictive policing has shown, the data these systems are fed, and the feedback loops they create, carry the bias that was passed on to them, and under the guise of technology and mathematics that bias becomes even harder to detect and address.

In the Detain/Release task, I couldn’t help but be heavily influenced by the predictive risk rating. Every time I saw a red colour, especially under violence, I clicked Detain. Halfway through, I decided to change my strategy and detain or release based on the crime committed, without looking at the predictive information.
But it is hard to flat-out ignore these “AI-informed” suggestions and I can see how decisions may be swayed based on what “data” suggests.
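The feedback-loop point can be made concrete with a toy simulation of my own (the numbers, the two neighbourhoods, and the reallocation rule are all invented for illustration, not taken from the readings). Two neighbourhoods have the *same* true crime rate, but a small initial bias in patrol allocation, amplified each year by a “data-driven” rule that sends more officers wherever past arrests were recorded, ends with all patrols concentrated in one neighbourhood:

```python
# Toy sketch of a predictive-policing feedback loop (my own illustration).
# Both neighbourhoods have an identical underlying crime rate; only the
# starting patrol allocation differs slightly.

TRUE_CRIME_RATE = 0.05                 # identical in A and B
patrol = {"A": 51.0, "B": 49.0}        # 100 officers; slight initial bias
recorded = {"A": 0.0, "B": 0.0}

for year in range(10):
    for hood in ("A", "B"):
        # Recorded arrests scale with patrol presence, not with any
        # real difference in crime.
        recorded[hood] = patrol[hood] * TRUE_CRIME_RATE
    # "Data-driven" rule: shift 5 officers toward the neighbourhood
    # with more recorded arrests.
    hot, cold = ("A", "B") if recorded["A"] > recorded["B"] else ("B", "A")
    shift = min(5.0, patrol[cold])
    patrol[hot] += shift
    patrol[cold] -= shift

print(f"After 10 years: A={patrol['A']:.0f} officers, B={patrol['B']:.0f} officers")
# → After 10 years: A=100 officers, B=0 officers
```

The runaway outcome comes entirely from the loop (more patrols → more recorded arrests → more patrols), which is exactly why the “data” can look objective while encoding the original bias.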