Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01pz50h0025
Full metadata record
dc.contributor.advisor: Pillow, Jonathan W.
dc.contributor.advisor: Norman, Kenneth Andrew
dc.contributor.author: Wu, Anqi
dc.contributor.other: Neuroscience Department
dc.date.accessioned: 2020-07-13T02:19:06Z
dc.date.available: 2020-07-13T02:19:06Z
dc.date.issued: 2019
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01pz50h0025
dc.description.abstract: Many studies in neuroscience posit that large-scale neural activity reflects noisy, high-dimensional observations of underlying low-dimensional signals of interest. One approach to identifying such signals is to develop latent variable models that formalize the relationship between the low-dimensional signals and high-dimensional measurements of neural activity. The low-dimensional structures we extract can help shed light on how information is encoded at the population level and provide significant scientific insight into the brain and human behavior. In recent years, rapid developments in recording techniques have yielded large amounts of neural data to analyze. With these data, we can develop inferential and statistical techniques to understand and interpret the latent structures underlying high-dimensional neural activity. In this thesis, we develop Bayesian latent variable models for five different neural tasks. In particular, we focus on Gaussian process latent variable models, which provide a flexible and interpretable way of modeling both the latent structures and the mapping from the latents to the observed neural data. Gaussian processes are powerful for imposing smoothness assumptions over functions, which we employ in different scenarios. However, due to the complexity of Gaussian-process-based modeling, we also need efficient and scalable inference methods for fitting these models to data. We propose a decoupled Laplace approximation and block coordinate descent for Gaussian process latent variable models, and a moment-based efficient estimator for quadratic convolutional subunit models. The discovered latent structures provide scientific insight into neural behavior, facilitating a greater understanding of the brain.
dc.language.iso: en
dc.publisher: Princeton, NJ : Princeton University
dc.relation.isformatof: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: http://catalog.princeton.edu
dc.subject: Bayesian probabilistic modeling
dc.subject: brain analysis
dc.subject: Gaussian process
dc.subject: latent variable model
dc.subject: neural recording
dc.subject: statistical model
dc.subject.classification: Neurosciences
dc.subject.classification: Computer science
dc.subject.classification: Artificial intelligence
dc.title: Bayesian latent structure discovery for large-scale neural recordings
dc.type: Academic dissertations (Ph.D.)
Appears in Collections: Neuroscience

Files in This Item:
wu_princeton_0181D_13116.pdf (27.68 MB, Adobe PDF)


Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
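The abstract describes the core setup of the thesis: noisy, high-dimensional neural recordings modeled as observations of smooth, low-dimensional latent signals drawn from a Gaussian process prior. The following is a minimal generative sketch of that setup (not code from the thesis; all function names, dimensions, and parameters are illustrative, and a simple linear readout with Gaussian noise stands in for the more flexible GP mappings the thesis develops):

```python
import numpy as np

def rbf_kernel(t, length_scale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance over time points t,
    # which encodes the smoothness assumption on latent trajectories.
    d = t[:, None] - t[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def sample_gp_latents(T=100, n_latents=3, seed=0):
    # Draw smooth low-dimensional latent trajectories from a GP prior.
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 10.0, T)
    K = rbf_kernel(t, length_scale=1.5) + 1e-6 * np.eye(T)  # jitter for stability
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((T, n_latents))  # shape (T, n_latents)

def simulate_neural_data(x, n_neurons=50, noise_std=0.5, seed=1):
    # Noisy high-dimensional "recordings" as a linear readout of the latents
    # (a stand-in for the GP latent-to-observation mapping).
    rng = np.random.default_rng(seed)
    C = rng.standard_normal((x.shape[1], n_neurons))
    return x @ C + noise_std * rng.standard_normal((x.shape[0], n_neurons))

x = sample_gp_latents()        # (100, 3) smooth latent signals
y = simulate_neural_data(x)    # (100, 50) noisy neural observations
```

Latent-variable methods like those in the thesis invert this generative direction: given only `y`, they infer the smooth low-dimensional `x` and the mapping that links the two.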