Abstract: The problem of analyzing collections of sensitive personal information while preserving privacy is decades old. A recent foundational examination of this problem has yielded differential privacy, a robust formal notion of privacy [Dwork, McSherry, Nissim, Smith 2006]. Differential privacy protects individuals from attackers who try to learn the information particular to them by requiring that the outcome of an analysis remain stable under any possible change to a single individual's input.
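For concreteness, the stability requirement is formalized in the cited work as follows: a randomized algorithm M is ε-differentially private if, for every pair of datasets D and D' that differ in a single individual's record and for every set S of possible outcomes,

```latex
% Epsilon-differential privacy, as defined in the cited work:
% M is a randomized algorithm, D and D' are datasets differing in
% one individual's record, and S ranges over sets of outcomes.
\Pr[M(D) \in S] \;\le\; e^{\epsilon} \cdot \Pr[M(D') \in S]
```

Smaller values of ε demand greater stability and hence give a stronger privacy guarantee.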
Research on differential privacy brings mathematical rigor to our understanding of privacy in computation. This research has uncovered deep connections between differential privacy and areas including learning theory, cryptography, complexity theory, algorithmic game theory, and statistics.
In this presentation, we will introduce differential privacy and explore some of these connections. In particular, we will show how differential privacy serves as a tool for data analysis, regardless of whether the underlying data is privacy sensitive.
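To give a concrete taste of the notion, below is a minimal sketch of the Laplace mechanism from the cited work, the canonical way to achieve ε-differential privacy for numeric queries; the function and variable names are illustrative choices of ours, not part of the talk.

```python
# A minimal sketch of the Laplace mechanism (Dwork, McSherry, Nissim, Smith 2006):
# answer a numeric query with noise calibrated to the query's sensitivity.
# Names and the example query below are illustrative assumptions, not from the talk.
import numpy as np

def laplace_mechanism(query_result: float, sensitivity: float, epsilon: float) -> float:
    """Return an epsilon-differentially private answer to a numeric query.

    `sensitivity` is the global sensitivity: the maximum change in the
    query's value when a single individual's record is added or removed.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return query_result + noise

# Example: a counting query has sensitivity 1, since one individual's
# record changes the count by at most 1.
data = [0, 1, 1, 0, 1, 1, 1, 0]
true_count = sum(data)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"true count: {true_count}, private count: {private_count:.2f}")
```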