Title: Application of Machine Vision to Detect Nuclear Labeled Cells in Full Mouse Brain Microscopy Data for the Study of Conditioned Taste Aversion
Authors: Frawley, Timothy
Advisors: Witten, Ilana
Department: Psychology
Class Year: 2022
Abstract: The purpose of this computational thesis was to compare two data pipelines for detecting cells in mouse brain microscopy data. The first pipeline, Cellfinder, adds a machine vision step that classifies cell candidates as either cells or artifacts. Cellfinder's performance was compared to that of a second pipeline, ClearMap, which uses filtering and thresholding but no neural-network classification step. Additionally, this thesis evaluated Cellfinder's classification on a larger data set with several notable differences from the data used in the original Cellfinder study. The data analysis was part of a larger study of conditioned taste aversion in mice and how the phenomenon differs between novel and familiar flavors. Several brain regions of interest, such as the Piriform area and Claustrum, had high cell counts and densities across all methods; however, the exact cell counts differed between Cellfinder and ClearMap. Training the Cellfinder network led to a large decrease in cell counts across all regions, likely due to a reduction in false positives. There remains room to improve Cellfinder's accuracy, as training achieved a validation accuracy of only 84%. Cellfinder also ran slowly on the Princeton computing cluster.
Keywords: Machine Learning, Conditioned Taste Aversion, Cell Detection
URI: http://arks.princeton.edu/ark:/88435/dsp01f1881q09x
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Psychology, 1930-2023

Files in This Item:
File: FRAWLEY-TIMOTHY-THESIS.pdf
Size: 1.63 MB
Format: Adobe PDF
Access: Request a copy
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.