Optimizing Information-theoretical Generalization Bound via Anisotropic Noise of SGLD

  • Bohan Wang,
  • Huishuai Zhang,
  • Jieyu Zhang,
  • Qi Meng,
  • Wei Chen,
  • Tie-Yan Liu

NeurIPS 2021

Recently, the information-theoretical framework has been shown to yield non-vacuous generalization bounds for large models trained by Stochastic Gradient Langevin Dynamics (SGLD) with isotropic noise. In this paper, we optimize the information-theoretical generalization bound by manipulating the noise structure in SGLD. We prove that, under a constraint guaranteeing low empirical risk, the optimal noise covariance is the square root of the expected gradient covariance when the prior and the posterior are jointly optimized. This validates that the optimal noise is quite close to the empirical gradient covariance. Technically, we develop a new information-theoretical bound that enables such an optimization analysis, and we apply matrix analysis to derive the form of the optimal noise covariance. The presented constraint and the theoretical results are validated by empirical observations.
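
As a rough illustration of the update rule this result suggests (not the authors' implementation), the sketch below runs SGLD with anisotropic noise under a diagonal approximation of the gradient covariance: the injected noise covariance is the element-wise square root of a running estimate of the per-sample gradient covariance. The function `grad_fn`, the step size `eta`, the inverse temperature `beta`, the EMA coefficient, and the diagonal approximation itself are all assumptions made for this example.

```python
import numpy as np

def sgld_anisotropic(grad_fn, theta0, data, eta=1e-3, beta=1e4,
                     batch_size=32, n_steps=1000, ema=0.9, rng=None):
    """SGLD with anisotropic noise (diagonal-covariance sketch).

    grad_fn(theta, batch) -> per-example gradients, shape (batch, dim).
    The noise covariance is taken as the square root of an EMA estimate
    of the gradient covariance (diagonal only), mirroring the form of the
    optimal covariance derived in the paper; the diagonal restriction and
    the temperature scaling below are simplifications for illustration.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    cov_diag = np.ones_like(theta)  # running diagonal of gradient covariance
    n = len(data)
    for _ in range(n_steps):
        idx = rng.choice(n, size=batch_size, replace=False)
        per_ex = grad_fn(theta, data[idx])          # (batch_size, dim)
        g = per_ex.mean(axis=0)                     # mini-batch gradient
        cov_diag = ema * cov_diag + (1 - ema) * per_ex.var(axis=0)
        # Noise covariance = sqrt of gradient covariance (diagonal case),
        # so the per-coordinate noise standard deviation is cov_diag ** 0.25.
        noise_std = np.sqrt(2 * eta / beta) * cov_diag ** 0.25
        theta = theta - eta * g + noise_std * rng.standard_normal(theta.shape)
    return theta
```

For a full-covariance variant one would replace `cov_diag ** 0.25` with a matrix square root of the estimated covariance applied to the Gaussian sample; the diagonal form is used here only to keep the sketch cheap and self-contained.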