Title: Efficient Estimation and Inference in Nonconvex Low-Complexity Models
Authors: Cai, Changxiao
Advisors: Chen, Yuxin
Poor, H. Vincent
Contributors: Electrical Engineering Department
Keywords: nonconvex optimization
principal component analysis
spectral methods
tensor completion
Subjects: Electrical engineering
Issue Date: 2021
Publisher: Princeton, NJ : Princeton University
Abstract: Low-complexity models serve as a pivotal tool for extracting key information from large-scale data, spanning a wide array of machine learning applications. However, owing to computational limits and the nonconvexity inherent in high dimensions, modern data analysis calls for new procedures that substantially reduce sample size and computational cost while preserving near-optimal statistical accuracy. This thesis is devoted to the development of efficient estimation and inference methods for low-rank models, and to the exploration of the theoretical foundations underlying these approaches. We start with statistical estimation of the column space of an unknown matrix given noisy and partial observations, focusing on the highly unbalanced case where the column dimension far exceeds the row dimension. We investigate an efficient spectral method and establish near-optimal statistical guarantees in terms of both $\ell_2$ and $\ell_{2,\infty}$ estimation accuracy. When applied to concrete statistical applications---tensor completion, principal component analysis, and community recovery---the general framework leads to significant performance improvements over the prior literature. Moving beyond matrix-type data, we study a natural higher-order generalization---noisy tensor completion. Given that existing methods are either computationally expensive or fail to achieve statistically optimal performance, we propose a two-stage nonconvex algorithm that simultaneously achieves near-optimal computational efficiency (i.e., linear time complexity) and statistical accuracy (i.e., minimal sample complexity and optimal estimation accuracy).
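The column-space estimation problem described above can be illustrated with a generic spectral method. The sketch below is not the thesis's exact procedure; it uses the common recipe of forming the sample Gram matrix, deleting its diagonal to mitigate noise bias (a standard trick in the unbalanced regime where columns vastly outnumber rows), and taking the top eigenvectors. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def spectral_column_space(Y, r):
    """Estimate the rank-r column space of the signal underlying noisy data Y.

    Generic spectral sketch: form the sample Gram matrix Y @ Y.T, zero out
    its diagonal to reduce the bias contributed by noise variance, and
    return the top-r eigenvectors as an orthonormal basis estimate.
    """
    G = Y @ Y.T
    np.fill_diagonal(G, 0.0)  # diagonal deletion mitigates noise bias
    eigvals, eigvecs = np.linalg.eigh(G)  # eigenvalues in ascending order
    top = np.argsort(eigvals)[::-1][:r]   # indices of the r largest
    return eigvecs[:, top]

# Toy usage: a rank-2 signal observed through additive Gaussian noise
rng = np.random.default_rng(0)
U = np.linalg.qr(rng.standard_normal((50, 2)))[0]  # true column space basis
Y = U @ rng.standard_normal((2, 2000)) + 0.1 * rng.standard_normal((50, 2000))
U_hat = spectral_column_space(Y, 2)
# Subspace distance ||U U^T - U_hat U_hat^T||_2 should be small
err = np.linalg.norm(U @ U.T - U_hat @ U_hat.T, 2)
```

The projection-distance metric `err` is invariant to the choice of basis, which is the natural way to compare subspaces rather than individual eigenvectors.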
In addition to estimation, we further characterize the non-asymptotic distribution of the proposed nonconvex estimator down to fine scales, and develop a data-driven inferential procedure that constructs optimal entrywise confidence intervals for the unknowns, fully adapting to unknown noise distributions and noise heteroscedasticity. As a byproduct, the distributional theory justifies the statistical optimality of the nonconvex estimator---its $\ell_2$ estimation error is unimprovable, including the pre-constant. All of this is attained through an integrated treatment of statistics and nonconvex optimization, together with a fine-grained analysis of spectral methods.
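The entrywise inference described above follows a generic recipe: once a non-asymptotic Gaussian approximation holds for each entry, a confidence interval is obtained from the estimate plus or minus a normal quantile times an estimated standard error. The sketch below illustrates that final step only; the function name and inputs are hypothetical placeholders, not the thesis's procedure for estimating the standard errors themselves.

```python
import numpy as np
from statistics import NormalDist

def entrywise_ci(estimate, stderr, alpha=0.05):
    """Two-sided (1 - alpha) confidence intervals, built entry by entry.

    Assumes each entry's estimation error is approximately
    N(0, stderr**2); then the interval is estimate +/- z * stderr,
    where z is the standard normal (1 - alpha/2) quantile.
    """
    z = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    return estimate - z * stderr, estimate + z * stderr

# Toy usage: intervals for two entries with different (heteroscedastic) errors
lo, hi = entrywise_ci(np.array([1.0, -0.5]), np.array([0.1, 0.2]))
```

Because the interval is built per entry with its own standard error, the construction adapts naturally to heteroscedastic noise, in the spirit of the data-driven procedure described in the abstract.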
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog.
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Electrical Engineering

Files in This Item:
File: Cai_princeton_0181D_13657.pdf (3.7 MB, Adobe PDF)

Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.