Title: Estimation Error For Regression and Optimal Convergence Rate
Authors: Wang, Yao
Advisors: E, Weinan
Contributors: Mathematics Department
Subjects: Applied mathematics
Issue Date: 2018
Publisher: Princeton, NJ : Princeton University
Abstract: In this thesis, we study the optimal convergence rate for the universal estimation error. Let F be the excess loss class associated with the hypothesis space and let n be the size of the data set. We prove that if the fat-shattering dimension satisfies fat_ε(F) = O(ε^{-p}), then the universal estimation error is O(n^{-1/2}) for p < 2 and O(n^{-1/p}) for p > 2. Among other things, this result gives a criterion for a hypothesis class to achieve the minimax optimal rate of O(n^{-1/2}). Examples are also provided of optimal rates different from O(n^{-1/p}), such as compactly supported convex Lipschitz continuous functions in R^d with d > 4, whose optimal rate is approximately O(n^{-2/d}). Training in practice may explore only a certain subspace of F, so it is useful to bound the complexity of the subspace actually explored rather than that of the whole class F. This is done for the gradient descent method.
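The rate dichotomy described in the abstract can be summarized as follows (a sketch in the abstract's notation; the scale-sensitive form fat_ε(F) is assumed, since the fat-shattering dimension is a function of the scale ε):

```latex
% Assumption on the complexity of the excess loss class F:
%   \mathrm{fat}_{\varepsilon}(F) = O(\varepsilon^{-p})
% Resulting universal estimation error as a function of the sample size n:
\mathcal{E}_n(F) =
\begin{cases}
  O\!\left(n^{-1/2}\right), & p < 2,\\[4pt]
  O\!\left(n^{-1/p}\right), & p > 2.
\end{cases}
```

In words: hypothesis classes whose fat-shattering dimension grows slower than ε^{-2} attain the minimax optimal rate n^{-1/2}, while richer classes pay a slower rate n^{-1/p}.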
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. These copies can be found through the library's main catalog.
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Mathematics

Files in This Item:
Wang_princeton_0181D_12599.pdf (345.14 kB, Adobe PDF)
