Title: Machine (Un)learning: An Investigation of Racial Bias in Predictive Recidivism Algorithms as a Product of Real-World, Structural Discrimination
Authors: Lewandowski, Carina
Advisors: Engelhardt, Barbara
Department: Computer Science
Class Year: 2021
Abstract: The Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) software is one example of machine learning technology capable of translating real-world bias into algorithmic bias, using historical, environmental, and behavioral data about individuals who have previously been arrested to predict their risk of recidivism. Because racism, social and economic exclusion, and implicit biases are embedded in policing and criminal justice systems, the COMPAS tool has been found to perform disparately on Black defendants. To date, much of the research on bias in the COMPAS tool has focused on algorithmic evaluations and top-level interventions. Bias, however, runs deeper than the predictive model and its outputs; algorithmic unfairness is merely a repercussion of real-world discrimination. The present research investigates how bias in predictive recidivism algorithms reflects structural inequalities, first by building on existing investigations of fairness in the COMPAS tool, and second by employing topic modeling techniques on textual charge descriptions to uncover racially mediated patterns in crime type and crime movement that might demonstrate the effects of base-layer bias. On the first point, this work finds that algorithmic bias in the predictive tool is reproducible. On the second point, while no significant difference in overall crime distribution was found across racial groups, this work does provide statistical evidence suggesting that Black individuals are disproportionately rearrested for certain types of crime. This finding has the potential to provide quantitative backing for literature on the unfair and targeted surveillance of Black communities, but it also offers a reason to consider how algorithmic assessments alone might be insufficient for determining whether tools like COMPAS are fit for criminal justice decision-making.
URI: http://arks.princeton.edu/ark:/88435/dsp015425kd804
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Computer Science, 1987-2023

Files in This Item:
File: LEWANDOWSKI-CARINA-THESIS.pdf
Size: 1.37 MB
Format: Adobe PDF
Access: Request a copy


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.