Title: Towards Flexible Active And Online Learning With Neural Networks
Authors: Ash, Jordan
Advisors: Adams, Ryan P
Contributors: Computer Science Department
Keywords: active learning
deep learning
machine learning
online learning
Subjects: Artificial intelligence
Issue Date: 2020
Publisher: Princeton, NJ : Princeton University
Abstract: Deep learning has achieved breakthrough successes on a wide array of machine learning tasks. Outside of the fully-supervised regime, however, many deep learning algorithms are brittle and unable to perform reliably across model architectures, dataset types, and optimization parameters. As a consequence, these algorithms are not easily usable by non-machine-learning experts, limiting their ability to meaningfully impact science and society. This thesis addresses some nuanced pathologies around the use of deep learning for active and passive online learning. We propose a practical active learning approach for neural networks that is robust to environmental variables: Batch Active learning by Diverse Gradient Embeddings (BADGE). We also discuss the deleterious generalization effects of warm-starting the optimization of neural networks in sequential environments, and why this is a major problem for deep learning. We introduce a simple method that remedies this problem, and discuss some important ramifications of its application.
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog.
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Computer Science

Files in This Item:
File: Ash_princeton_0181D_13526.pdf
Size: 48.56 MB
Format: Adobe PDF

Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.