Lawyers and computer scientists hold very different notions of privacy.
Privacy laws rely on narrower and less formal conceptions of risk than those
described by the computer science literature. As a result, the law often
creates uncertainty and fails to protect against the full range of data
privacy risks. In contrast, formal mathematical models such as differential
privacy provide a quantifiable, robust guarantee of privacy against a wide
range of potential attacks, including attacks that are currently unknown or
unforeseen.
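For concreteness, the standard formulation: a randomized algorithm $M$ satisfies $\varepsilon$-differential privacy if, for every pair of datasets $D$ and $D'$ differing in the data of a single individual and every set of possible outputs $S$,
\[
\Pr[M(D) \in S] \le e^{\varepsilon} \cdot \Pr[M(D') \in S].
\]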

The subject of much theoretical investigation, differential privacy has
recently been making significant strides towards practical implementation.
However, because the law generally relies on very different methods for
mitigating risk, a significant challenge to implementation will be
demonstrating that the new privacy technologies satisfy legal requirements
for privacy protection. In particular, most privacy laws focus on the
identifiability of data, or the ability to link an individual to a record in
a release of data. In doing so, they often equate privacy with heuristic
"de-identification" approaches and provide little guidance for implementing
more formal privacy-preserving techniques.

In this talk, we will articulate the gap between legal and technical
approaches to privacy and present a methodology for formally proving that a
technological method for privacy protection satisfies the requirements of a
particular law. This methodology involves two steps: first, translating a
legal standard into a formal mathematical requirement of privacy and,
second, constructing a rigorous proof that a given technique satisfies the
mathematical requirement derived from the law. We will walk
through an example applying this new methodology to bridge the requirements
of the Family Educational Rights and Privacy Act (FERPA) and differential
privacy. 

This talk summarizes early results from ongoing research by Kobbi Nissim,
Aaron Bembenek, Mark Bun, Marco Gaboardi, and Salil Vadhan from the Center
for Research on Computation and Society, together with Urs Gasser, David
O'Brien, and Alexandra Wood from the Berkman Center for Internet & Society.
Further work building from this approach is anticipated to form the basis of
a future publication. This research is also part of a broader collaboration
through Harvard's Privacy Tools for Sharing Research Data project, which
aims to build legal and technical tools, such as tools for differentially
private statistical analysis, to help enable the wider sharing of social
science research data while protecting the privacy of individuals.