
A privacy-protecting architecture for recommendation systems via the suppression of ratings

Abstract

Recommendation systems are information-filtering systems that help users cope with information overload. Unfortunately, current recommendation systems raise serious privacy concerns. In this work, we propose an architecture that enables users to enhance their privacy in systems that profile users on the basis of the items they rate. Our approach capitalizes on a conceptually simple perturbative technique, namely the suppression of ratings. In our scenario, users rate those items on which they have an opinion; however, to avoid being accurately profiled, they may refrain from rating certain items. Consequently, this technique protects user privacy to a certain extent, but at the cost of a degradation in the accuracy of the recommendations. We measure privacy risk as the Kullback-Leibler divergence between the user's and the population's rating distributions, a privacy criterion that we proposed in previous work. The justification of this criterion is our second contribution. Concretely, we interpret it thoroughly by elaborating on the intimate connection between the celebrated method of entropy maximization and the use of entropies and divergences as measures of privacy. The ultimate purpose of this justification is to bridge the gap between the privacy and information-theoretic communities by substantially adapting some technicalities of our original work to reach a wider audience not intimately familiar with information theory and the method of types. Lastly, we present a formulation of the optimal trade-off between privacy and suppression rate, which allows us to formally specify one of the functional blocks of the proposed architecture.
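
As a concrete illustration of the privacy criterion described above, the sketch below computes the Kullback-Leibler divergence between a user's apparent rating distribution and the population's, before and after some ratings are suppressed. This is a minimal sketch in Python, not the paper's formulation: the category counts, the uniform population distribution, and helper names such as apparent_profile are illustrative assumptions.

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) in bits, with smoothing to avoid log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log2(p / q)))

def apparent_profile(user_counts, suppress_counts):
    """Rating histogram observed by the system after suppressing some ratings per category."""
    kept = np.maximum(np.asarray(user_counts, float) - np.asarray(suppress_counts, float), 0.0)
    return kept / kept.sum()

# Hypothetical toy data: per-category rating counts (assumed, not from the paper).
population = np.array([0.25, 0.25, 0.25, 0.25])  # population's rating distribution
user = np.array([8, 1, 1, 0])                    # user's rating counts per category
suppress = np.array([5, 0, 0, 0])                # ratings withheld in the dominant category

risk_before = kl_divergence(user, population)
risk_after = kl_divergence(apparent_profile(user, suppress), population)
suppression_rate = suppress.sum() / user.sum()
print(f"risk before: {risk_before:.3f} bits, after: {risk_after:.3f} bits, "
      f"suppression rate: {suppression_rate:.2f}")

Under this toy setup, withholding ratings in the over-represented category pulls the user's apparent profile toward the population's, lowering the divergence at the price of a higher suppression rate; this mirrors the privacy versus suppression-rate trade-off that the abstract's final contribution formalizes.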
