Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01r207ts492
Full metadata record
DC Field: Value
dc.contributor.advisor: Murthy, Mala
dc.contributor.advisor: Shaevitz, Joshua W
dc.contributor.author: Pereira, Talmo
dc.contributor.other: Neuroscience Department
dc.date.accessioned: 2022-02-11T21:31:07Z
dc.date.available: 2024-01-25T13:00:06Z
dc.date.created: 2021-01-01
dc.date.issued: 2021
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01r207ts492
dc.description.abstract: Understanding how the brain generates behavior is a core goal of neuroscience. The need for tools to quantify naturalistic, freely-moving animal behavior has given rise to the nascent field of computational ethology. The work presented in this thesis describes the key contributions we have made to this field in the form of novel computational approaches for quantifying animal behavior. Natural behavior can be quantified at varying degrees of detail, ranging from coarse center-of-mass positional tracking to detailed motion capture of individual body parts. Until recently, it was not possible to perform full-body motion capture in freely behaving animals without physical markers, specialized hardware or other experimental constraints. To address this, we developed LEAP (LEAP Estimates Animal Pose), a system for unconstrained markerless motion capture. Inspired by emerging deep learning-based approaches for human pose estimation, LEAP uses neural networks to predict body landmark locations from raw video frames. A core innovation in LEAP was to leverage lightweight neural networks that quickly specialize on new datasets with very few labeled examples and can be iteratively improved through human-in-the-loop training. LEAP was highly effective at tracking animals from flies and mice to fish and giraffes, but it was not designed for tracking multiple animals simultaneously. To address this, we developed its successor, SLEAP (Social LEAP), which explicitly models the problem of tracking multiple poses when animals are closely interacting. SLEAP was implemented from the ground up as a deep learning framework with infrastructure enabling custom network architectures and multiple approaches to detection, grouping and tracking. We showed that SLEAP outperforms existing methods by 1-2 orders of magnitude in both accuracy and speed, enabling real-time multi-animal pose tracking, which we demonstrate by implementing closed-loop optogenetic control of social behaviors. Finally, we applied these methods by developing a high-resolution behavioral monitoring setup to probe the structure of fly courtship behavior. We used SLEAP to track the poses of freely interacting pairs of males and females while recording courtship song. Through experimental manipulations and computational modeling of the female response to male song, we found evidence for specific neural circuit mechanisms for multisensory integration across timescales.
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.publisher: Princeton, NJ : Princeton University
dc.relation.isformatof: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: http://catalog.princeton.edu
dc.subject: animal behavior
dc.subject: computer vision
dc.subject: deep learning
dc.subject: ethology
dc.subject: pose estimation
dc.subject: tracking
dc.subject.classification: Neurosciences
dc.subject.classification: Computer science
dc.title: Uncovering the Structure of Animal Behavior via Deep Learning
dc.type: Academic dissertations (Ph.D.)
pu.embargo.terms: 2024-01-25
pu.date.classyear: 2021
pu.department: Neuroscience
Appears in Collections: Neuroscience
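The abstract describes pose estimation as neural networks predicting body landmark locations from raw video frames. In LEAP-style systems, the network outputs one confidence map per body part, and coordinates are recovered by finding each map's peak. The following is a minimal, self-contained sketch of that decoding step only (synthetic Gaussian "confidence" bumps stand in for real network output; function and variable names are illustrative, not taken from the LEAP/SLEAP codebases):

```python
import numpy as np

def decode_landmarks(confidence_maps):
    """Convert per-part confidence maps of shape (H, W, n_parts)
    into (x, y) peak coordinates, one row per body part.

    A toy illustration of heatmap decoding; the actual LEAP/SLEAP
    implementations use subpixel refinement and learned networks.
    """
    h, w, n_parts = confidence_maps.shape
    flat = confidence_maps.reshape(h * w, n_parts)
    idx = np.argmax(flat, axis=0)            # peak index per part
    ys, xs = np.unravel_index(idx, (h, w))   # back to row/col
    return np.stack([xs, ys], axis=1)        # (n_parts, 2) as (x, y)

# Synthetic example: two Gaussian "confidence" bumps on a 64x64 frame,
# standing in for the network's predicted maps for two body parts.
yy, xx = np.mgrid[0:64, 0:64]

def bump(cx, cy, sigma=3.0):
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))

maps = np.stack([bump(10, 20), bump(50, 40)], axis=-1)
coords = decode_landmarks(maps)  # → [[10, 20], [50, 40]]
```

Peak decoding like this is what makes the approach markerless: the network localizes each body part directly from pixels, with no physical markers attached to the animal.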

Files in This Item:
File: Pereira_princeton_0181D_13934.pdf (39.23 MB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.