A Universal Transfer Theorem for Convex Optimization Algorithms Using Inexact First-order Oracles
- Amitabh Basu
- Hongyi Jiang
- Phillip Kerger
- Marco Molinaro
2024 International Conference on Machine Learning
Given any algorithm for convex optimization that uses exact first-order information (i.e., function values and subgradients), we show how to use such an algorithm to solve the problem with access only to inexact first-order information. This is done in a “black-box” manner, without knowledge of the internal workings of the algorithm. This complements the work of Devolder, Glineur, and Nesterov and of Schmidt, Le Roux, and Bach, who analyze the performance of specific algorithms, such as (accelerated) gradient descent, under inexact information. In particular, our results apply to a much wider range of algorithms beyond variants of gradient descent, e.g., projection-free methods, cutting-plane methods, or any other first-order methods formulated in the future. Furthermore, they also apply to algorithms that handle structured nonconvexities such as mixed-integer decision variables.
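To make the oracle interfaces concrete, here is a minimal sketch in Python. It is not the authors' transfer construction; it merely illustrates the setup the abstract describes: a generic first-order method that consumes an oracle as a black box, an exact oracle for a toy convex function, and a δ-inexact oracle whose value and subgradient are each perturbed by at most δ (all names and the perturbation are illustrative assumptions).

```python
def exact_oracle(x):
    """Exact first-order oracle for f(x) = x**2: returns (f(x), f'(x))."""
    return x * x, 2.0 * x

def inexact_oracle(x, delta=0.01):
    """A delta-inexact oracle: value and subgradient are each off by at
    most delta (a deterministic perturbation, chosen for illustration)."""
    f, g = exact_oracle(x)
    return f + delta, g + delta

def subgradient_descent(oracle, x0, step=0.1, iters=200):
    """A generic first-order method: it only queries the oracle, so it can
    be handed either the exact or the inexact oracle as a black box."""
    x = x0
    for _ in range(iters):
        _, g = oracle(x)
        x -= step * g
    return x

x_star = subgradient_descent(inexact_oracle, x0=1.0)
# The iterates settle near the perturbed stationary point -delta/2 = -0.005,
# i.e., within O(delta) of the true minimizer 0.
```

The point of the paper is that such robustness need not be re-proved algorithm by algorithm: the transfer theorem lets any exact-oracle method be used with inexact oracles without inspecting its internals.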