Data-Driven Model Reduction, Scientific Frontiers, and Applications
- Texas A&M University
- College Station, TX
- Joe C. Richardson Petroleum Engineering Building (RICH) 910
David Huckleberry Gutman, Industrial and Systems Engineering
Tangent Subspace Descent via Discontinuous Subspace Selections on Fixed-Rank Manifolds
Abstract
The tangent subspace descent method (TSD) extends the coordinate descent algorithm to manifold domains. The key insight underlying TSD is an analogy between coordinate blocks in Euclidean space and tangent subspaces of a manifold. The core principle behind ensuring convergence of TSD for smooth functions is the appropriate choice of subspace at each iteration. Previous work showed that such subspaces can always be chosen appropriately on the broad class of manifolds known as naturally reductive homogeneous spaces. In this talk, we provide the first instances of TSD for manifolds outside of this class. The main idea underlying these new instances is the use of discontinuous subspace selections. As a result of these developments, we obtain new and efficient methods for large-scale optimization on the fixed-rank and fixed-rank positive semidefinite matrix manifolds.
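To illustrate the coordinate-block analogy in the simplest possible terms, the sketch below runs a block-wise descent on the fixed-rank matrix manifold: at each step the Euclidean gradient is projected onto one of three blocks of the tangent space and the result is retracted back to the manifold by truncated SVD. This is a minimal toy example only, not the speaker's algorithm; the objective (distance to a rank-r target), the cyclic block rule, the step size, and the SVD retraction are all assumptions made for the demonstration.

```python
# Illustrative sketch of a block "tangent subspace descent"-style iteration on the
# fixed-rank manifold. NOT the speaker's method: objective, block split, step size,
# and retraction are assumptions chosen for this toy demo.
import numpy as np

m, n, r = 30, 20, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))   # rank-r target

def f(X):
    """Smooth objective: squared Frobenius distance to A."""
    return 0.5 * np.linalg.norm(X - A) ** 2

def retract(Y, r):
    """Retraction onto the fixed-rank manifold via truncated SVD."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def block_tangent_grad(X, G, block, r):
    """Project the Euclidean gradient G onto one of three tangent 'blocks' at X.

    The tangent space at X = U S V^T splits into a core block U M V^T and two
    complementary blocks U_p V^T and U V_p^T; this split plays the role of
    Euclidean coordinate blocks in the toy iteration."""
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    U, V = U[:, :r], Vt[:r, :].T
    PU, PV = U @ U.T, V @ V.T
    if block == 0:                                   # core block: U M V^T
        return PU @ G @ PV
    if block == 1:                                   # block U_p V^T
        return (np.eye(PU.shape[0]) - PU) @ G @ PV
    return PU @ G @ (np.eye(PV.shape[0]) - PV)       # block U V_p^T

X = retract(rng.standard_normal((m, n)), r)   # random rank-r starting point
alpha = 0.5
for k in range(300):
    G = X - A                                 # Euclidean gradient of f
    xi = block_tangent_grad(X, G, k % 3, r)   # cyclic choice of tangent block
    X = retract(X - alpha * xi, r)            # step within the block, then retract

print(f"final objective: {f(X):.3e}")
```

The cyclic block rule here is continuous in the iterate and suffices only because this toy problem is so well behaved; the talk concerns subspace selections that are deliberately discontinuous, which is what makes TSD workable on the fixed-rank and fixed-rank positive semidefinite manifolds.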