Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01dv13zx44x
Title: High-Dimensional Statistics under Covariance Eigenvalue Decay
Authors: Silin, Igor
Advisors: Fan, Jianqing
Contributors: Operations Research and Financial Engineering Department
Subjects: Statistics
Issue Date: 2022
Publisher: Princeton, NJ : Princeton University
Abstract: High-dimensional data are ubiquitous in modern applications. In order to develop successful statistical methods in high dimensions, one inevitably needs to impose some structural assumptions on the underlying data-generating model. The existing statistical literature mostly concerns sparsity or low-rank structure of the data. In contrast, this dissertation is devoted to another type of structural assumption: eigenvalue decay of the covariance matrix of the data. The credibility of this condition has been confirmed for numerous real-world datasets in applied statistical research. We demonstrate how covariance eigenvalue decay can be used to design new methods and prove dimension-free error bounds for several fundamental statistical problems. We start with a linear regression setting and present an estimator that exploits the covariance eigenvalue decay. Our approach leads to a new perspective on the high-dimensional linear regression problem, where new notions of joint effective dimension and signal-to-noise ratio play a crucial role in controlling the relative error of our estimator, arguably a more suitable metric for the problem. Further, our methodology is extended to generalized linear models, which additionally cover binary classification, Poisson regression, and gamma regression (regression with nonnegative continuous response). Beyond linear models, we apply our procedure in reproducing kernel Hilbert spaces, which leads to new estimators for nonparametric regression and classification. Along with these supervised statistical problems, we also consider an unsupervised setting, in particular, hypothesis testing for eigenspaces of the covariance matrix. New tests for principal directions of the data in both one-sample and two-sample scenarios are proposed. In the context of this problem, covariance eigenvalue decay allows us to improve on theoretical guarantees from the prior literature.
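The following is a minimal illustrative sketch of the structural condition the abstract refers to, not the dissertation's actual estimator: it simulates data whose covariance eigenvalues decay polynomially and inspects the empirical spectrum together with a trace-based effective-dimension proxy, tr(Sigma)/||Sigma||_op. The decay exponent and the effective-dimension formula are assumptions made here for illustration; the dissertation's own notion of joint effective dimension may differ.

import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 1000                             # fewer samples than features
eigvals = np.arange(1, p + 1) ** -2.0        # assumed polynomial eigenvalue decay
# Data with diagonal covariance diag(eigvals); an orthogonal rotation
# of the features would leave the spectrum unchanged.
X = rng.standard_normal((n, p)) * np.sqrt(eigvals)

emp_cov = X.T @ X / n                        # empirical covariance matrix
emp_eigs = np.linalg.eigvalsh(emp_cov)[::-1] # eigenvalues, sorted descending

# Trace-based effective dimension: small when eigenvalues decay fast,
# even though the ambient dimension p is large.
eff_dim_true = eigvals.sum() / eigvals.max()
eff_dim_emp = emp_eigs.sum() / emp_eigs.max()
print("top 5 empirical eigenvalues:", np.round(emp_eigs[:5], 4))
print(f"effective dimension  true: {eff_dim_true:.2f}  empirical: {eff_dim_emp:.2f}")

Under this kind of decay, quantities such as the effective dimension above stay bounded as p grows, which is the intuition behind the dimension-free error bounds mentioned in the abstract.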
URI: http://arks.princeton.edu/ark:/88435/dsp01dv13zx44x
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Operations Research and Financial Engineering

Files in This Item:
File: Silin_princeton_0181D_14326.pdf
Size: 1.47 MB
Format: Adobe PDF


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.