Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01r207ts492
Title: Uncovering the Structure of Animal Behavior via Deep Learning
Authors: Pereira, Talmo
Advisors: Murthy, Mala
Shaevitz, Joshua W
Contributors: Neuroscience Department
Keywords: animal behavior
computer vision
deep learning
ethology
pose estimation
tracking
Subjects: Neurosciences
Computer science
Issue Date: 2021
Publisher: Princeton, NJ : Princeton University
Abstract: Understanding how the brain generates behavior is a core goal of neuroscience. The need for tools to quantify naturalistic, freely moving animal behavior has given rise to the nascent field of computational ethology. The work presented in this thesis describes the key contributions we have made to this field in the form of novel computational approaches for quantifying animal behavior. Natural behavior can be quantified at varying degrees of detail, ranging from coarse center-of-mass positional tracking to detailed motion capture of individual body parts. Until recently, it was not possible to perform full-body motion capture in freely behaving animals without physical markers, specialized hardware, or other experimental constraints. To address this, we developed LEAP (LEAP Estimates Animal Pose), a system for unconstrained markerless motion capture. Inspired by emerging deep learning-based approaches for human pose estimation, LEAP uses neural networks to predict body landmark locations from raw video frames. A core innovation in LEAP was to leverage lightweight neural networks that quickly specialize to new datasets from very few labeled examples, which can be iteratively improved through human-in-the-loop training. LEAP was highly effective at tracking animals from flies and mice to fish and giraffes, but it was not designed for tracking multiple animals simultaneously. To address this, we developed its successor, SLEAP (Social LEAP), which explicitly models the problem of tracking multiple poses when animals are closely interacting. SLEAP was implemented from the ground up as a deep learning framework with infrastructure enabling custom network architectures and multiple approaches to detection, grouping, and tracking. We showed that SLEAP outperforms existing methods by 1-2 orders of magnitude in both accuracy and speed, enabling real-time multi-animal pose tracking, which we demonstrate by implementing closed-loop optogenetic control of social behaviors.
Finally, we apply these methods by developing a high-resolution behavioral monitoring setup to probe the structure of fly courtship behavior. We used SLEAP to track poses of freely interacting pairs of males and females while recording courtship song. Through experimental manipulations and computational modeling of the female response to male song, we found evidence for specific neural circuit mechanisms for multisensory integration across timescales.
URI: http://arks.princeton.edu/ark:/88435/dsp01r207ts492
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: catalog.princeton.edu
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Neuroscience

Files in This Item:
This content is embargoed until 2024-01-25. For questions about theses and dissertations, please contact the Mudd Manuscript Library. For questions about research datasets, as well as other inquiries, please contact the DataSpace curators.


Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.