Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01pn89d9370
Title: Optimization of Mutual Information in Learning: Explorations in Science
Authors: Strouse, Daniel
Advisors: Schwab, David J
Bialek, William
Contributors: Physics Department
Subjects: Artificial intelligence
Issue Date: 2018
Publisher: Princeton, NJ : Princeton University
Abstract: This thesis explores three applications of information theory in machine learning, all involving the optimization of information flow in some learning problem. In Chapter 2, we introduce a method for extracting the most informative bits that one signal contains about another. Our method, the deterministic information bottleneck (DIB), is an alternative formulation of the information bottleneck (IB). In Chapter 3, we adapt the DIB to the problem of finding the most informative clusterings of geometric data. We also introduce an approach to model selection that naturally emerges within the (D)IB framework. In Chapter 4, we introduce an approach to encourage or discourage agents in a multi-agent reinforcement learning setting to share information with one another. We conclude in Chapter 5 by discussing ongoing and future work in these directions.
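For context, the information bottleneck objective referenced in the abstract can be sketched as follows. The notation below is the standard one from the IB literature, not taken from this record: T is a compressed representation of a source X that should remain informative about a relevance variable Y, with beta trading off compression against preserved information.

```latex
% Information bottleneck (IB): compress X into a representation T
% while preserving information about a relevance variable Y
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta\, I(T;Y)

% Deterministic information bottleneck (DIB): the compression cost
% I(X;T) is replaced by the entropy H(T), which favors deterministic
% (hard) encoders
\min_{p(t \mid x)} \; H(T) \;-\; \beta\, I(T;Y)
```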
URI: http://arks.princeton.edu/ark:/88435/dsp01pn89d9370
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: catalog.princeton.edu
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Physics

Files in This Item:
File: Strouse_princeton_0181D_12733.pdf
Size: 10.83 MB
Format: Adobe PDF
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.