Streaming PCA: Matching Matrix Bernstein and Near-Optimal Finite Sample Guarantees for Oja’s Algorithm

  • Prateek Jain,
  • Chi Jin,
  • Sham Kakade,
  • Praneeth Netrapalli,
  • Aaron Sidford

Proceedings of The 29th Conference on Learning Theory (COLT)


This work provides improved guarantees for streaming principal component analysis (PCA). Given $A_1, \ldots, A_n \in \mathbb{R}^{d \times d}$ sampled independently from distributions satisfying $\mathbb{E}[A_i] = \Sigma$ for some $\Sigma \succeq 0$, this work provides an $O(d)$-space, linear-time, single-pass streaming algorithm for estimating the top eigenvector of $\Sigma$. The algorithm nearly matches (and in certain cases improves upon) the accuracy obtained by the standard batch method, which computes the top eigenvector of the empirical covariance $\frac{1}{n} \sum_{i \in [n]} A_i$, as analyzed by the matrix Bernstein inequality. Moreover, to achieve constant accuracy, our algorithm improves upon the best previously known sample complexities of streaming algorithms by either a multiplicative factor of $O(d)$ or $1/\mathrm{gap}$, where $\mathrm{gap}$ is the relative distance between the top two eigenvalues of $\Sigma$.
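For reference, the batch baseline described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the paper's code: it assumes the $A_i$ are symmetric and simply diagonalizes the empirical mean, which requires $O(d^2)$ space, in contrast to the $O(d)$-space streaming algorithm.

```python
import numpy as np

def batch_top_eigenvector(As):
    """Batch baseline: return the top eigenvector of the empirical
    covariance (1/n) * sum_i A_i, assuming each A_i is symmetric.
    Uses O(d^2) space, unlike the streaming algorithm."""
    Sigma_hat = np.mean(As, axis=0)
    # eigh returns eigenvalues in ascending order for symmetric input,
    # so the top eigenvector is the last column.
    _, vecs = np.linalg.eigh(Sigma_hat)
    return vecs[:, -1]
```

This is the estimator whose accuracy, as bounded by the matrix Bernstein inequality, the streaming algorithm nearly matches.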
These results are achieved through a novel analysis of the classic Oja's algorithm, one of the oldest and most popular algorithms for streaming PCA. In particular, this work shows that simply picking a random initial point $w_0$ and applying the update rule $w_{i+1} = w_i + \eta_i A_i w_i$ suffices to accurately estimate the top eigenvector, with a suitable choice of $\eta_i$. We believe our result sheds light on how to efficiently perform streaming PCA both in theory and in practice, and we hope that our analysis may serve as the basis for analyzing many variants and extensions of streaming PCA.
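The Oja update described above can be sketched in NumPy as follows. This is a hedged illustration, not the paper's implementation: the per-step renormalization and the $1/(i+1)$ step-size schedule are common practical choices supplied here for concreteness, not prescribed by the abstract (which only says "a suitable choice of $\eta_i$").

```python
import numpy as np

def oja_top_eigenvector(stream, d, eta=lambda i: 1.0 / (i + 1), seed=0):
    """Estimate the top eigenvector of E[A_i] from a stream of d x d
    matrices via Oja's update w <- w + eta_i * A_i @ w, starting from a
    random unit vector. Renormalizing after each step (a standard
    practical choice, assumed here) keeps the iterate numerically
    bounded without changing its direction. Uses O(d) space."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for i, A in enumerate(stream):
        w = w + eta(i) * (A @ w)
        w /= np.linalg.norm(w)
    return w
```

A typical usage pattern feeds in rank-one samples $A_i = x_i x_i^\top$ with $x_i$ drawn i.i.d. from a distribution with covariance $\Sigma$, consuming each sample once as it arrives.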