Towards a Theory of Non-Log-Concave Sampling: First-Order Stationarity Guarantees for Langevin Monte Carlo
- Krishnakumar Balasubramanian
- Sinho Chewi
- Murat A. Erdogdu
- Adil Salim
- Matthew Zhang
For the task of sampling from a density \(\pi \propto \exp(-V)\) on \(\mathbb{R}^d\), where \(V\) is possibly non-convex but \(L\)-gradient Lipschitz, we prove that averaged Langevin Monte Carlo outputs a sample with \(\varepsilon\)-relative Fisher information after \(O(L^2 d^2/\varepsilon^2)\) iterations. This is the sampling analogue of complexity bounds for finding an \(\varepsilon\)-approximate first-order stationary point in non-convex optimization, and therefore constitutes a first step towards a general theory of non-log-concave sampling. We discuss numerous extensions and applications of our result; in particular, it yields a new state-of-the-art guarantee for sampling from distributions which satisfy a Poincaré inequality.
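For reference, the quantities in the abstract can be written out using the standard formulation of Langevin Monte Carlo and of relative Fisher information from this literature; this is a sketch of the common conventions, with the step size \(h > 0\) and the Gaussian noise \(\xi_k\) being assumed notation rather than taken from the paper (the precise averaging of iterates is specified in the paper body):

\[
  x_{k+1} = x_k - h\,\nabla V(x_k) + \sqrt{2h}\,\xi_k,
  \qquad \xi_k \sim \mathcal{N}(0, I_d),
\]
\[
  \mathrm{FI}(\mu \,\|\, \pi)
  = \mathbb{E}_{\mu}\!\left[\lVert \nabla \log(\mu/\pi) \rVert^2\right],
\]

so that an \(\varepsilon\)-guarantee in relative Fisher information plays the role that \(\lVert \nabla f \rVert \le \varepsilon\) plays for first-order stationarity in non-convex optimization.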