Small Count Privacy and Large Count Utility in Data Publishing
While the introduction of differential privacy has been a major breakthrough
in the study of privacy preserving data publication, some recent work has
pointed out a number of cases where it is not possible to limit inference about
individuals. The dilemma intrinsic to the problem is the simultaneous
requirement of utility in the published data. Differential privacy does
not aim to protect information about an individual that can be uncovered even
without the participation of the individual. However, this lack of coverage may
violate the principle of individual privacy. Here we propose a solution by
providing protection for sensitive information, by which we refer to the answers
to aggregate queries with small counts. Previous works based on
ℓ-diversity can be seen as providing a special form of this kind of
protection. Our method is developed with a further goal, namely to provide a
differential privacy guarantee, and for that we introduce a more refined form
of differential privacy to deal with certain practical issues. Our empirical
studies show that our method preserves utility better than a number of
state-of-the-art methods, even though those methods do not provide the
protections that we provide.

Comment: 12 pages, 12 figures
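As a minimal sketch of why small counts are the hard case, the snippet below applies the standard Laplace mechanism (not the paper's refined mechanism, whose details are not given in this abstract) to a counting query. The function name `laplace_count` and the parameter values are illustrative assumptions; the underlying fact is that a counting query has sensitivity 1, so the noise scale 1/ε is fixed, and the relative error therefore dominates small counts while barely affecting large ones.

```python
import numpy as np

def laplace_count(true_count, epsilon, rng):
    """Release a count under epsilon-differential privacy via the
    standard Laplace mechanism. A counting query has sensitivity 1,
    so the noise is drawn from Laplace(0, 1/epsilon)."""
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(0)
eps = 0.1  # illustrative privacy budget
# Expected absolute noise is 1/eps = 10 regardless of the true count,
# so the relative error is ~200% for a count of 5 but ~0.1% for 10000.
for count in (5, 10_000):
    rel = abs(laplace_count(count, eps, rng) - count) / count
    print(f"count={count}: relative error {rel:.3f}")
```

This makes the dilemma in the abstract concrete: with a fixed noise scale, either small counts (the sensitive answers) are published with near-useless accuracy, or the budget ε is raised and the privacy of those same small counts erodes.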