Computer Science – Information Theory
Scientific paper
2010-12-20
Differential privacy is a notion of privacy that has become very popular in the database community. Roughly, the idea is that a randomized query mechanism provides sufficient privacy protection if the ratio between the probabilities that two different entries originate a certain answer is bounded by e^\epsilon. In the fields of anonymity and information flow there is a similar concern for controlling information leakage, i.e. limiting the possibility of inferring the secret information from the observables. In recent years, researchers have proposed to quantify the leakage in terms of the information-theoretic notion of mutual information. Two main approaches fall into this category: one based on Shannon entropy, and one based on Rényi min-entropy. The latter has a connection with the so-called Bayes risk, which expresses the probability of guessing the secret. In this paper, we show how to model the query system as an information-theoretic channel, and we compare the notion of differential privacy with that of mutual information. We show that differential privacy is strictly stronger, in the sense that it implies a bound on the mutual information, but not vice versa.
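The channel view described in the abstract can be made concrete with a small sketch. Below, a hypothetical query mechanism is represented as a channel matrix whose rows are adjacent databases and whose columns are possible answers; the differential-privacy condition is the e^ε ratio bound between rows, and the Shannon-based leakage is the mutual information of the channel under a prior. The matrix `C` and the helper names are illustrative assumptions, not the paper's own notation.

```python
import math

# Hypothetical channel matrix: rows are adjacent databases,
# columns are answers; C[x][y] = P(answer y | database x).
C = [
    [0.6, 0.3, 0.1],
    [0.4, 0.4, 0.2],
]

def satisfies_dp(channel, eps):
    """Check the eps-differential-privacy bound: for every pair of rows
    and every answer, the probability ratio must not exceed e^eps."""
    bound = math.exp(eps)
    for row_a in channel:
        for row_b in channel:
            for p, q in zip(row_a, row_b):
                if q == 0:
                    if p > 0:
                        return False
                elif p / q > bound:
                    return False
    return True

def mutual_information(channel, prior):
    """Shannon mutual information I(X;Y) of the channel under a prior
    on the rows, in bits."""
    # Marginal distribution on answers.
    p_y = [sum(prior[x] * channel[x][y] for x in range(len(channel)))
           for y in range(len(channel[0]))]
    mi = 0.0
    for x, row in enumerate(channel):
        for y, p in enumerate(row):
            if prior[x] * p > 0:
                mi += prior[x] * p * math.log2(p / p_y[y])
    return mi
```

For this matrix the worst ratio is 0.2/0.1 = 2, so the mechanism satisfies ε-differential privacy only for ε ≥ ln 2 ≈ 0.693; the mutual information under a uniform prior is small but positive, illustrating that a DP bound limits, rather than eliminates, the leakage.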
Mário S. Alvim
Konstantinos Chatzikokolakis
Pierpaolo Degano
Catuscia Palamidessi