Phase Retrieval using Alternating Minimization
- Praneeth Netrapalli
- Prateek Jain
- Sujay Sanghavi
IEEE Transactions on Signal Processing, 2015. An extended abstract appeared in the 27th Annual Conference on Neural Information Processing Systems, 2013.
Phase retrieval problems involve solving linear equations, but with missing sign (or phase, for complex numbers) information. More than four decades after it was first proposed, the seminal error reduction algorithm of Gerchberg and Saxton (1972) and Fienup (1982) is still the popular choice for solving many variants of this problem. The algorithm is based on alternating minimization; i.e., it alternates between estimating the missing phase information and estimating the candidate solution. Despite its wide usage in practice, no global convergence guarantees for this algorithm are known. In this paper, we show that a (resampling) variant of this approach converges geometrically to the solution of one such problem: finding a vector x from y and A, where y = |A'x| and |z| denotes the vector of element-wise magnitudes of z, under the assumption that A is Gaussian.
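To make the alternating structure concrete, here is a minimal NumPy sketch of this scheme for the real-valued case (so the missing "phases" are signs), using a spectral initialization; the function name altmin_phase and all parameter choices are illustrative, not the authors' reference implementation.

```python
import numpy as np

def altmin_phase(A, y, iters=50):
    """Alternating minimization for real-valued phase retrieval (sketch).

    A has shape (n, m); column i is the measurement vector a_i and
    y[i] = |a_i . x|. Returns an estimate of x up to a global sign.
    """
    n, m = A.shape
    # Spectral initialization: top eigenvector of (1/m) * sum_i y_i^2 a_i a_i^T,
    # scaled by the norm estimate sqrt(mean(y^2)).
    S = (A * y**2) @ A.T / m
    _, V = np.linalg.eigh(S)
    x = V[:, -1] * np.sqrt(np.mean(y**2))
    for _ in range(iters):
        # Step 1: estimate the missing signs ("phases") from the current iterate.
        p = np.sign(A.T @ x)
        p[p == 0] = 1.0
        # Step 2: least-squares fit of x to the sign-completed linear system A'x = p*y.
        x, *_ = np.linalg.lstsq(A.T, p * y, rcond=None)
    return x
```

With Gaussian measurements and m sufficiently large relative to n (up to log factors, per the guarantees discussed below), the iterates typically converge to the true vector up to a global sign flip.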
Empirically, we demonstrate that alternating minimization performs similarly to recently proposed convex techniques for this problem (which are based on "lifting" to a convex matrix problem) in sample complexity and robustness to noise. However, it is much more efficient and can scale to large problems. Analytically, for a resampling version of alternating minimization, we show geometric convergence to the solution, with a sample complexity that is within log factors of obvious lower bounds. We also establish close-to-optimal scaling for the case when the unknown vector is sparse. Our work represents the first theoretical guarantee for alternating minimization (albeit with resampling) for any variant of the phase retrieval problem in the non-convex setting.
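The resampling variant referred to above uses a fresh, disjoint batch of measurements in each iteration, so that the current iterate is statistically independent of the batch it is fit against. A hedged sketch of that modification, with an arbitrary equal-size split rather than the partition sizes prescribed by the analysis, might look like this:

```python
import numpy as np

def altmin_phase_resampled(A, y, iters=10):
    """Resampling variant (illustrative sketch): each iteration fits against a
    fresh, disjoint block of measurements, making the iterate independent of
    the block it is fit to. Equal-size blocks are an assumption made here for
    simplicity, not the paper's prescription.
    """
    n, m = A.shape
    blocks = np.array_split(np.arange(m), iters + 1)
    # Spectral initialization on the first block only.
    A0, y0 = A[:, blocks[0]], y[blocks[0]]
    S = (A0 * y0**2) @ A0.T / len(blocks[0])
    x = np.linalg.eigh(S)[1][:, -1] * np.sqrt(np.mean(y0**2))
    for t in range(1, iters + 1):
        At, yt = A[:, blocks[t]], y[blocks[t]]
        p = np.sign(At.T @ x)  # sign ("phase") estimate on fresh measurements
        p[p == 0] = 1.0
        x, *_ = np.linalg.lstsq(At.T, p * yt, rcond=None)  # least-squares update
    return x
```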