Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01v692t937q
Title: | The Solution Space of the Binary Perceptron Model and Inference of Sparse Graphons |
Authors: | Li, Shuangping |
Advisors: | Abbe, Emmanuel; Sly, Allan |
Contributors: | Applied and Computational Mathematics Department |
Keywords: | graphon inference; perceptron; sharp threshold; solution space |
Subjects: | Mathematics; Statistics |
Issue Date: | 2022 |
Publisher: | Princeton, NJ : Princeton University |
Abstract: | This thesis studies two problems in discrete probability theory motivated by modern machine learning applications. The first topic is the binary perceptron model, a simple model of neural networks that has attracted significant attention in the statistical physics, information theory, and probability theory communities. For the symmetric binary perceptron model, we establish that the partition function, normalized by its expected value, converges to a lognormal distribution. As a consequence, we establish several conjectures for this model, including the contiguity, sharp threshold, and strong freezing conjectures. The strong freezing property further points to an interesting phenomenon not present in sparse constraint satisfaction problems: on the one hand, the perceptron model is conjectured to be typically hard; on the other hand, efficient algorithms have been shown empirically to succeed in finding solutions, suggesting that such algorithms find atypical solutions. We formally establish this phenomenon for both the symmetric and asymmetric binary perceptrons. We show that at low constraint density there exists a subdominant connected cluster of solutions with almost maximal diameter, and that an efficient multiscale majority algorithm can find solutions in such a cluster with high probability. The second topic is learning sparse graphons. The problem of learning graphons has attracted considerable attention across several scientific communities, with significant progress in recent years on sparser regimes. Yet current techniques still require diverging degrees for efficient algorithms to succeed in the challenging cases where the local structure of the graph is homogeneous. We provide an efficient algorithm that learns graphons in the constant expected degree regime. The algorithm is shown to succeed in estimating the rank-k projection of a graphon in the L2 metric if the top k eigenvalues of the graphon satisfy a generalized Kesten-Stigum condition. |
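To make the abstract's setup concrete: in the symmetric binary perceptron, a candidate assignment x in {-1,+1}^n is a solution when every Gaussian constraint vector g_a satisfies |<g_a, x>| <= kappa * sqrt(n), and the partition function counts such solutions. The following is a minimal illustrative sketch of that constraint check (the function name, the small dimensions n, m, and the margin kappa = 1.0 are assumptions for the example, not from the thesis):

```python
import numpy as np

def is_sbp_solution(G, x, kappa):
    """Return True iff x in {-1,+1}^n satisfies every symmetric
    perceptron constraint |<g_a, x>| <= kappa * sqrt(n),
    where the rows of G are the constraint vectors g_a."""
    n = x.shape[0]
    return bool(np.all(np.abs(G @ x) <= kappa * np.sqrt(n)))

# Illustrative random instance (dimensions chosen arbitrarily).
rng = np.random.default_rng(0)
n, m, kappa = 20, 5, 1.0
G = rng.standard_normal((m, n))    # m Gaussian constraint vectors
x = rng.choice([-1, 1], size=n)    # one candidate binary assignment
print(is_sbp_solution(G, x, kappa))
```

The partition function is then the number of x in {-1,+1}^n passing this check; the thesis studies its fluctuations and the geometry of the resulting solution set.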
URI: | http://arks.princeton.edu/ark:/88435/dsp01v692t937q |
Alternate format: | The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: catalog.princeton.edu |
Type of Material: | Academic dissertations (Ph.D.) |
Language: | en |
Appears in Collections: | Applied and Computational Mathematics |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Li_princeton_0181D_14122.pdf | | 1.8 MB | Adobe PDF | View/Download |
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.