Statistical Guarantees for Alternating Minimization
Alternating minimization (AltMin) is a generic term for a widely popular approach to non-convex inference: often, the variables can be partitioned into two (or more) sets so that the problem is convex/tractable in one set when the other is held fixed (and vice versa). This allows one to alternate between optimally updating one set of variables and then the other. AltMin methods typically do not come with global consistency guarantees, even though they are empirically observed to perform better than methods (e.g., those based on convex optimization) that do. In this talk, we will present rigorous performance guarantees for AltMin in three statistical inference settings: low-rank matrix completion, phase retrieval, and learning sparsely-used dictionaries. The overarching theme behind our results has two parts: (i) devising new initialization procedures (as opposed to initializing randomly, as is typical), and (ii) establishing exponential convergence from this initialization. Our work shows that the pursuit of statistical guarantees can yield algorithmic improvements (initialization, in our case) that perform measurably better in practice.
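For concreteness, below is a minimal NumPy sketch of AltMin for one of the three settings, low-rank matrix completion (alternating least squares), started from an SVD-based spectral initialization in the spirit of part (i). The function name, parameters, and the specific initialization are illustrative assumptions, not the exact algorithm analyzed in the talk.

```python
import numpy as np

def altmin_matrix_completion(M_obs, mask, rank, n_iters=50):
    """Alternating least squares for low-rank matrix completion (sketch).

    M_obs : (m, n) array; entries outside `mask` are ignored
    mask  : (m, n) boolean array, True where an entry is observed
    rank  : target rank r of the factorization M ~ U @ V.T
    """
    m, n = M_obs.shape
    # Spectral initialization: top-r SVD of the zero-filled observed matrix,
    # rather than a random start (illustrating the talk's part (i)).
    M0 = np.where(mask, M_obs, 0.0)
    U0, S, Vt = np.linalg.svd(M0, full_matrices=False)
    U = U0[:, :rank] * S[:rank]
    V = Vt[:rank].T
    for _ in range(n_iters):
        # Fix V; each row U[i] solves a small least-squares problem
        # over the entries observed in row i.
        for i in range(m):
            obs = mask[i]
            if obs.any():
                U[i] = np.linalg.lstsq(V[obs], M_obs[i, obs], rcond=None)[0]
        # Fix U; symmetric update for each row of V (column j of M).
        for j in range(n):
            obs = mask[:, j]
            if obs.any():
                V[j] = np.linalg.lstsq(U[obs], M_obs[obs, j], rcond=None)[0]
    return U, V
```

On a synthetic low-rank matrix with entries observed uniformly at random, the reconstruction error of such a scheme typically decays geometrically across iterations, matching the exponential-convergence behavior described in part (ii).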
Speaker Details
Praneeth Netrapalli is a PhD student in the ECE department at UT Austin, where he is advised by Prof. Sujay Sanghavi. His research focuses on designing and understanding practical algorithms for solving machine learning problems. He obtained his MS from UT Austin and his B.Tech from IIT Bombay, both in Electrical Engineering. His prior experience includes two years as a quantitative analyst at Goldman Sachs, where he worked on pricing and evaluating risk for complex financial products.
- Date:
- Speaker: Praneeth Netrapalli
- Affiliation: University of Texas