Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01kh04ds807
Title: Diagnosing Racial Bias in Machine Learning: An Examination of Fairness in Medical Appointment Scheduling Optimization Models
Authors: Wang, Eileen
Advisors: Vanderbei, Robert
Department: Operations Research and Financial Engineering
Certificate Program: Applications of Computing Program
Center for Statistics and Machine Learning
Class Year: 2021
Abstract: A growing number of cases have shown that machine learning algorithms exhibit racial and gender biases, often reflecting deeply entrenched societal prejudices. This is particularly worrisome in the healthcare industry, where unfair decisions can cost lives. Machine learning technologies have a growing impact on this sector through many applications, ranging from image recognition for disease detection to precision medicine based on genomic data. However, machine learning can play a critical role in the patient's experience even before they step into the clinic: appointments can now be scheduled by machines rather than humans in ways that optimize clinic performance, as measured by patient and provider wait times. More specifically, algorithms predict which patients carry the greatest no-show risk and then place those patients in overbooked appointment slots. While this strategy increases efficiency and profitability for healthcare providers, such scheduling algorithms have been reported to treat demographic groups unfairly, namely causing Black patients to wait about 30% longer than non-Black patients at clinics. This paper breaks the problem into two distinct models: a no-show prediction model and an optimal appointment scheduling model. The no-show prediction model consists of training machine learning algorithms to predict whether a patient will show up for their appointment; data bias is examined at this step, since minority groups are often underrepresented or misrepresented in the datasets. For the optimal appointment scheduling model, the focus is on detecting algorithmic bias, as properties of the scheduling algorithms may unfairly favor certain demographic groups.
In addition to identifying biases at both of these steps, the paper seeks to present a cohesive solution involving adjustments to both models, ideally keeping clinic efficiency high while making wait times more equitable across racial demographics.
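To illustrate the scheduling strategy the abstract describes, the sketch below ranks patients by predicted no-show risk and double-books the riskiest into overbooked slots. The patient data, risk scores, and the `schedule` helper are all hypothetical; this is a minimal illustration of the mechanism, not the thesis's actual models.

```python
def schedule(patients, n_overbooked):
    """Rank patients by predicted no-show risk (descending).

    The `n_overbooked` riskiest patients are assigned to overbooked
    (double-booked) slots; everyone else gets a regular slot.
    Returns (overbooked_ids, regular_ids).
    """
    ranked = sorted(patients, key=lambda p: p["no_show_risk"], reverse=True)
    overbooked = [p["id"] for p in ranked[:n_overbooked]]
    regular = [p["id"] for p in ranked[n_overbooked:]]
    return overbooked, regular

# Hypothetical patients with risk scores from a no-show prediction model.
patients = [
    {"id": "A", "no_show_risk": 0.9},
    {"id": "B", "no_show_risk": 0.2},
    {"id": "C", "no_show_risk": 0.7},
    {"id": "D", "no_show_risk": 0.1},
]

overbooked, regular = schedule(patients, n_overbooked=2)
# overbooked -> ["A", "C"]; regular -> ["B", "D"]
```

The fairness concern the thesis examines arises exactly here: if the risk model systematically over-predicts no-shows for one demographic group, that group is disproportionately double-booked and therefore waits longer.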
URI: http://arks.princeton.edu/ark:/88435/dsp01kh04ds807
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Operations Research and Financial Engineering, 2000-2023

Files in This Item:
File: WANG-EILEEN-THESIS.pdf (1.83 MB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.