Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01d791sk48p
Title: Bayesian Filtering for Neural Networks
Authors: Bencomo, Gianluca Michele
Advisors: Griffiths, Thomas L.
Department: Computer Science
Class Year: 2023
Publisher: Princeton, NJ : Princeton University
Abstract: Adaptability is a crucial component of intelligent systems. In dynamic environments, biological organisms continuously adjust their behavior to match the demands of the current situation. This fundamental feature of biological intelligence, however, conflicts with a typical assumption of machine learning: that the data-generating process is static over time. Bayesian filtering offers a rich toolkit for relaxing this assumption, but the complex, high-dimensional parameter space of neural networks poses several challenges. This thesis investigates these challenges and presents progress toward addressing them. In the simple linear-Gaussian case, we show that we can effectively learn the dynamics and process noise of a time-varying hierarchical Bayesian linear regression model. The result is a simple linear regressor that can adapt its weights to fit lines without any data at forward timesteps. In the nonlinear case, we derive two methods for variational Kalman filtering, one based on conjugate gradients and the other on natural gradients. These methods produce promising results on small networks, but more work is required before we can learn dynamics and process noise in higher-dimensional, more complex parameter spaces.
URI: http://arks.princeton.edu/ark:/88435/dsp01d791sk48p
Type of Material: Academic dissertations (M.S.E.)
Language: en
Appears in Collections: Computer Science, 2023
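The linear-Gaussian setting described in the abstract can be illustrated with a standard Kalman filter over regression weights. The sketch below is not the thesis's method (which additionally learns the dynamics and process noise); it is a minimal example, with assumed dynamics `F`, process noise `Q`, and observation noise `r`, showing how a Bayesian linear regressor can track weights that drift over time:

```python
import numpy as np

def kalman_step(m, P, x, y, F, Q, r):
    """One predict/update step of a Kalman filter over regression
    weights, with dynamics w_t = F w_{t-1} + noise(Q) and scalar
    observation y_t = x^T w_t + noise(r)."""
    # Predict: propagate the weight posterior through the dynamics.
    m_pred = F @ m
    P_pred = F @ P @ F.T + Q
    # Update: condition on the scalar observation y = x^T w + noise.
    S = x @ P_pred @ x + r          # innovation variance (scalar)
    K = P_pred @ x / S              # Kalman gain
    m_new = m_pred + K * (y - x @ m_pred)
    P_new = P_pred - np.outer(K, x) @ P_pred
    return m_new, P_new

# Toy demo: track a slowly drifting 2-D weight vector.
rng = np.random.default_rng(0)
d = 2
F, Q, r = np.eye(d), 1e-3 * np.eye(d), 0.1   # assumed, not learned
w_true = np.array([1.0, -0.5])
m, P = np.zeros(d), np.eye(d)
for _ in range(200):
    w_true = F @ w_true + rng.normal(scale=np.sqrt(1e-3), size=d)
    x = rng.normal(size=d)
    y = x @ w_true + rng.normal(scale=np.sqrt(r))
    m, P = kalman_step(m, P, x, y, F, Q, r)
```

After 200 steps the posterior mean `m` closely tracks the drifting `w_true`. The thesis's hierarchical formulation instead treats `F` and `Q` as unknowns to be learned, and its variational Kalman filters extend this recursion to nonlinear neural-network observation models.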
Files in This Item:

File | Size | Format
---|---|---
Bencomo_princeton_0181G_14499.pdf | 2.31 MB | Adobe PDF
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.