Thought Leadership

Differential Privacy: What is all the noise about?

Roxana Danger of RegulAItion’s data science team provides a technically oriented walk-through of Differential Privacy in Machine Learning, with a particular eye towards Federated Learning.

Differential Privacy is a formal definition of privacy that provides rigorous guarantees against privacy breaches during data processing. It makes no assumptions about the knowledge or computational power of adversaries, and it offers an interpretable, quantifiable and composable formalism. Differential Privacy has been actively researched over the last 15 years, but it remains difficult for many Machine Learning practitioners to master.
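To make the guarantee concrete, the classic Laplace mechanism adds noise calibrated to a query's sensitivity (how much one individual's record can change the answer). The sketch below, a minimal illustration with names of my own choosing rather than anything from the paper, releases an epsilon-differentially-private count:

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sampling: for U ~ Uniform(-1/2, 1/2),
    # -scale * sign(U) * ln(1 - 2|U|) follows Laplace(0, scale).
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon, rng=None):
    """Release a count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing a single
    record changes the true count by at most 1, so Laplace noise with
    scale 1/epsilon is sufficient for the epsilon-DP guarantee.
    """
    rng = rng or random.Random()
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon, rng)

# Example: a noisy answer to "how many people are 30 or older?"
ages = [23, 35, 41, 29, 52]
noisy = private_count(ages, lambda a: a >= 30, epsilon=1.0)
```

Smaller epsilon means stronger privacy but more noise; composability means that running several such queries spends a total privacy budget equal to the sum of their epsilons.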

This paper aims to provide an overview of the most important ideas, concepts and uses of Differential Privacy in Machine Learning, with special focus on its intersection with Federated Learning.

The paper may be accessed on arXiv here.