Title: Algorithmic (In)Justice: The Bias and Unfairness of Risk Assessment Instruments Used in Sentencing
Authors: Bernius, Jeremy
Advisors: Macedo, Stephen
Department: Princeton School of Public and International Affairs
Certificate Program: Program in Technology & Society, Technology Track
Class Year: 2022
Abstract: Technology and algorithms pervade every aspect of modern society, and criminal justice is no exception. Algorithms are currently used to help predict when and where crime will occur, whether an offender will appear for their court date, how much supervision someone on parole or probation requires, and how an offender's criminogenic and social needs should be addressed. In recent years, algorithms have entered sentencing decisions as well, providing risk assessments for judges to consult when considering the length and type of punishment an offender should receive. This technology, the algorithmic risk assessment instrument (RAI), has garnered considerable attention in the popular press and academic literature, especially since a 2016 ProPublica investigation revealed some of the ways it produces and reinforces racial bias in the criminal justice system. Since then, scholars and practitioners across disciplines have argued both for and against various aspects of RAIs, including their design and implementation as well as the role they play in sentencing. Because this discourse is fairly recent, there are few settled facts or positions on the issue. This paper strives to contribute to that discourse by forging a holistic, interdisciplinary, empirical approach to understanding and outlining the contours, impacts, and future of using algorithmic risk assessment instruments in sentencing. Specifically, I seek to answer two unsettled questions: as they are currently designed and implemented, do algorithmic risk assessment instruments foster racial bias and drive racial disparities in sentencing decisions? If so, what does this imply for policies on the use of algorithmic risk assessment instruments in sentencing? I adopt a qualitative approach supported by quantitative findings, drawing on government reports, scientific studies, representative surveys, legal analyses, historical accounts, and reported interviews throughout each chapter. My first substantive chapter (the second chapter) serves as a literature review, investigating the historical background, current practice, and emerging criticism of risk assessments. My third chapter presents a case study of Virginia and the Nonviolent Risk Assessment, its state-developed instrument. My fourth chapter is more technical and mathematical, explaining the statistics behind many validation studies and algorithmic mechanisms before turning to a philosophical perspective on fairness. My fifth chapter uses legal theory to outline permissible and constitutional designs. Lastly, I share my conclusions and explain the policy implications of this paper in my sixth and final chapter. Ultimately, this paper finds that algorithmic risk assessment instruments, as they are currently designed and implemented, do foster racial bias and drive racial disparities in sentencing decisions. With this in mind, I believe that jurisdictions' approaches to using algorithmic risk assessments in sentencing are flawed and might be improved in a few ways.
While I do not offer substantive policy recommendations, my research implies that jurisdictions using RAIs in sentencing should conduct frequent validation studies on the population that will be subject to risk assessment, analyze the impacts and limitations of RAIs in sentencing, investigate errors and identify whom they might affect, and consider the constraints of fairness and constitutionality when selecting data to be used in algorithmic calculations.
URI: http://arks.princeton.edu/ark:/88435/dsp01z890rx45k
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Princeton School of Public and International Affairs, 1929-2023

Files in This Item:
File: BERNIUS-JEREMY-THESIS.pdf
Size: 1.53 MB
Format: Adobe PDF
