Title: Algorithms for Data Normalization with Applications to Stop and Frisk
Authors: Fillmore, Mark
Advisors: Walker, David
Department: Computer Science
Class Year: 2015
Abstract: Data modeling is already a difficult task, and errors introduced during data entry make it harder still. Inconsistencies in large quantities of data can make it difficult to perform any kind of automated analysis. We motivate our investigation into improved data cleaning methods by revealing disastrous non-uniformity in data related to the controversial Stop and Frisk policy as implemented by the NYPD. These inconsistencies help guide our construction of workflow F, which consults multiple similarity measurements in order to dictate proper transformations of non-uniform data into standardized values. F increases the volume of non-standardized data that is correctly transformed by 887% in comparison to common existing methods, such as the Levenshtein distance. We conclude by presenting additional pathways for improvement and describing how to most effectively apply workflow F as part of an interactive tool.
Extent: 44 pages
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections:Computer Science, 1988-2017

Files in This Item:
File: PUTheses2015-Fillmore_Mark.pdf
Size: 858.51 kB
Format: Adobe PDF

Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.