The Promise of Differential Privacy: A Tutorial on Algorithmic Techniques
Differential privacy describes a promise, made by a data curator to a data subject: you will not be affected, adversely or otherwise, by allowing your data to be used in any study, no matter what other studies, data sets, or information sources are available. At their best, differentially private database mechanisms can make confidential data widely available for accurate data analysis, without resorting to data clean rooms, institutional review boards, data usage agreements, restricted views, or data protection plans. To enjoy the fruits of the research described in this tutorial, the data analyst must accept that the raw data can never be accessed directly and that, eventually, data utility is consumed: overly accurate answers to too many questions will destroy privacy. The goal of algorithmic research on differential privacy is to postpone this inevitability as long as possible.
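To make the kind of mechanism behind this promise concrete, the following is a minimal illustrative sketch (not taken from the source text) of the Laplace mechanism applied to a counting query: noise calibrated to the query's sensitivity is added to the true answer, trading a small amount of accuracy for a quantifiable privacy guarantee. The function name `laplace_count`, the example data, and the choice of epsilon are assumptions made for illustration only.

```python
import numpy as np

def laplace_count(data, predicate, epsilon):
    """Release a differentially private count of records satisfying `predicate`.

    A counting query has sensitivity 1: adding or removing one individual's
    record changes the true count by at most 1, so Laplace noise with scale
    1/epsilon yields epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for row in data if predicate(row))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: a noisy answer to "how many records have value greater than 50?"
records = [12, 87, 45, 63, 91, 30]
noisy_answer = laplace_count(records, lambda x: x > 50, epsilon=0.5)
```

Each such noisy answer consumes part of an overall privacy budget; asking many highly accurate questions exhausts it, which is the sense in which data utility is eventually consumed.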