Title: Efficiently Escaping Saddle Points on Manifolds
Authors: Criscitiello, Christopher
Advisors: Boumal, Nicolas
Department: Mathematics
Class Year: 2019
Abstract: We generalize the perturbed gradient descent algorithm (PGD) of Jin et al. to Riemannian manifolds (for Jin et al.'s work, see [How to Escape Saddle Points Efficiently (2017), Stochastic Gradient Descent Escapes Saddle Points Efficiently (2019)]). For an arbitrary Riemannian manifold $\mathcal{M}$ of dimension $d$, a sufficiently smooth nonconvex objective function $f$, and weak conditions on the chosen retraction, our algorithm, perturbed Riemannian gradient descent (PRGD), achieves an $\epsilon$-second-order critical point in $O((\log d)^4 / \epsilon^2)$ gradient queries, matching the complexity achieved by perturbed gradient descent in the Euclidean case. Like PGD, PRGD requires no Hessian information and depends only polylogarithmically on the dimension $d$. This matters for applications involving optimization on manifolds in high dimension, such as PCA and low-rank matrix completion. Our key idea is to distinguish between two types of gradient steps: "steps on the manifold" and "steps in a tangent space" of the manifold. This distinction allows us to seamlessly extend Jin et al.'s analysis.
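
To make the abstract's ingredients concrete, here is a minimal illustrative sketch (not the thesis's exact PRGD, whose constants, stopping rules, and tangent-space phases are more careful) of perturbed Riemannian gradient descent on the unit sphere. All function names and parameter values below are hypothetical choices for demonstration: the retraction is metric projection (renormalization), ordinary descent steps are the "steps on the manifold", and when the Riemannian gradient is small the iterate is perturbed within the tangent space before descending again.

import numpy as np

def project_to_tangent(x, v):
    # Project an ambient vector v onto the tangent space of the sphere at x.
    return v - np.dot(x, v) * x

def retract(x, v):
    # Metric-projection retraction on the sphere: step in the tangent
    # space, then renormalize back onto the manifold.
    y = x + v
    return y / np.linalg.norm(y)

def prgd_sketch(f_grad, x0, eta=0.01, eps=1e-3, radius=1e-2,
                max_iters=10_000, rng=None):
    # Riemannian gradient descent with occasional tangent-space
    # perturbations near critical points, to escape saddles.
    # f_grad(x) returns the ambient (Euclidean) gradient of f at x.
    rng = np.random.default_rng() if rng is None else rng
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iters):
        g = project_to_tangent(x, f_grad(x))  # Riemannian gradient
        if np.linalg.norm(g) <= eps:
            # Near a critical point: perturb within the tangent space,
            # then keep descending. (The thesis's PRGD instead runs a
            # controlled sequence of "steps in a tangent space" here.)
            xi = project_to_tangent(x, rng.normal(size=x.shape))
            xi *= radius / max(np.linalg.norm(xi), 1e-12)
            x = retract(x, xi)
        else:
            x = retract(x, -eta * g)  # a "step on the manifold"
    return x

# Example: minimizing the Rayleigh quotient x^T A x over the sphere;
# starting from the eigenvector e1 (a saddle/maximum of this problem),
# the perturbation lets the iterates escape toward a minimal eigenvector.
A = np.diag([3.0, 1.0, -2.0])
x = prgd_sketch(lambda x: 2 * A @ x, np.array([1.0, 0.0, 0.0]))

The Rayleigh-quotient example is chosen because its saddle points are exactly the non-extremal eigenvectors, so escaping saddles is the whole difficulty; this mirrors the PCA application mentioned in the abstract.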
URI: http://arks.princeton.edu/ark:/88435/dsp014t64gr02n
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Mathematics, 1934-2023

Files in This Item:
File: CRISCITIELLO-CHRISTOPHER-THESIS.pdf
Size: 339.84 kB
Format: Adobe PDF


Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.