Private Convex Optimization via Exponential Mechanism
- Sivakanth Gopi,
- Yin Tat Lee,
- Daogao Liu
In this paper, we study private optimization problems for non-smooth convex functions \(F(x)=\mathbb{E}_i f_i(x)\) on \(\mathbb{R}^d\). We show that modifying the exponential mechanism by adding an \(\ell_2^2\) regularizer to \(F(x)\) and sampling from \(\pi(x)\propto \exp(-k(F(x)+\mu\|x\|_2^2/2))\) recovers both the known optimal empirical risk and population loss under \((\epsilon,\delta)\)-DP. Furthermore, we show how to implement this mechanism for DP-SCO using \(\widetilde{O}(n\min(d,n))\) queries to \(f_i(x)\), where \(n\) is the number of samples/users and \(d\) is the ambient dimension. We also give a (nearly) matching lower bound of \(\widetilde{\Omega}(n\min(d,n))\) on the number of evaluation queries.
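To make the mechanism concrete, here is a minimal toy sketch that samples from \(\pi(x)\propto \exp(-k(F(x)+\mu\|x\|_2^2/2))\) using unadjusted Langevin dynamics (ULA). This stand-in sampler and the loss \(F(x)=\frac{1}{n}\sum_i \|x-a_i\|_2\) are illustrative assumptions, not the paper's (more efficient) sampler:

```python
import numpy as np

# Hypothetical setup: a 1-Lipschitz non-smooth loss F(x) = mean_i ||x - a_i||_2
# over synthetic data points a_i, regularized with (mu/2)||x||^2 and scaled by k.
rng = np.random.default_rng(0)
d, k, mu = 5, 10.0, 0.5
data = rng.normal(size=(20, d))  # hypothetical dataset a_1, ..., a_20

def grad_U(x):
    # (Sub)gradient of U(x) = k * (F(x) + mu * ||x||^2 / 2).
    diffs = x - data
    norms = np.linalg.norm(diffs, axis=1, keepdims=True)
    sub = np.where(norms > 1e-12, diffs / np.maximum(norms, 1e-12), 0.0)
    return k * (sub.mean(axis=0) + mu * x)

# ULA iteration: x <- x - h * grad_U(x) + sqrt(2h) * N(0, I).
h = 1e-3
x = np.zeros(d)
samples = []
for t in range(5000):
    x = x - h * grad_U(x) + np.sqrt(2 * h) * rng.normal(size=d)
    if t >= 1000:  # discard burn-in iterations
        samples.append(x.copy())
samples = np.array(samples)
print(samples.mean(axis=0))  # concentrates near the regularized minimizer
```

Because the regularized potential is \(k\mu\)-strongly convex, the chain mixes toward \(\pi\); ULA's query complexity, however, depends polynomially on the accuracy, which is exactly the gap the paper's sampler closes.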
Our results rely on the following tools, which are of independent interest: (1) We prove Gaussian Differential Privacy (GDP) of the exponential mechanism when the loss function is strongly convex and the perturbation is Lipschitz. Our privacy bound is optimal, as it includes the privacy of the Gaussian mechanism as a special case, and it is proved using the isoperimetric inequality for strongly log-concave measures. (2) We show how to sample from \(\exp(-F(x)-\mu\|x\|_2^2/2)\) for \(G\)-Lipschitz \(F\) with \(\eta\) error in total variation (TV) distance using \(\widetilde{O}((G^2/\mu)\log^2(d/\eta))\) unbiased queries to \(F(x)\). This is the first sampler whose query complexity has *polylogarithmic dependence* on both the dimension \(d\) and the accuracy \(\eta\).