Computer Science – Databases
Scientific paper
2010-11-10
Over the last decade there have been great strides in developing techniques to compute functions privately. In particular, Differential Privacy provides strong guarantees limiting the conclusions that can be drawn about any individual. In contrast, various syntactic methods for providing privacy (criteria such as k-anonymity and l-diversity) have been criticized for still allowing private information about an individual to be inferred. In this report, we consider the ability of an attacker to use data meeting these privacy definitions to build an accurate classifier. We demonstrate that even under Differential Privacy, such classifiers can be used to accurately infer "private" attributes in realistic data. We compare this to similar inference-based attacks on other forms of anonymized data. Placing these attacks on the same scale, we observe that the accuracy of inference of private attributes for Differentially Private data and l-diverse data can be quite similar.
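To make the setting concrete, here is a minimal sketch of the two ingredients the abstract refers to: an epsilon-differentially-private counting query (via the standard Laplace mechanism), and a toy attacker that uses only such noisy counts to infer the majority value of a sensitive attribute. This is an illustrative assumption-laden sketch, not the paper's actual attack; the record format and the `hiv` field name are invented for the example.

```python
import random
import math

def laplace_noise(scale):
    # Sample from Laplace(0, scale) by inverting the CDF.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def dp_count(records, predicate, epsilon):
    # A counting query has sensitivity 1 (adding or removing one record
    # changes the count by at most 1), so Laplace noise with scale
    # 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

def infer_attribute(records, epsilon):
    # Hypothetical attacker: queries only DP-protected counts, then
    # predicts the sensitive value with the larger noisy count.
    yes = dp_count(records, lambda r: r["hiv"] == 1, epsilon)
    no = dp_count(records, lambda r: r["hiv"] == 0, epsilon)
    return 1 if yes > no else 0
```

With a skewed population and a weak privacy budget (large epsilon, little noise), the attacker's majority-vote inference is almost always correct, which is the population-level leakage the abstract contrasts with individual-level guarantees.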
Individual Privacy vs Population Privacy: Learning to Attack Anonymization