
    An Accuracy-Assured Privacy-Preserving Recommender System for Internet Commerce

    Recommender systems, which predict users' potential preferences from historical data and user interests, are increasingly important in Internet applications such as online shopping. As a well-known recommendation method, neighbourhood-based collaborative filtering has attracted considerable attention recently, and so has the risk of revealing users' private information during the filtering process. Among current solutions, probabilistic techniques have shown a powerful privacy-preserving effect. However, when facing a k Nearest Neighbour (kNN) attack, the existing methods provide no data utility guarantee because they introduce global randomness. In this paper, to overcome this loss of recommendation accuracy, we propose a novel approach, Partitioned Probabilistic Neighbour Selection, which ensures a required prediction accuracy while maintaining high security against the kNN attack. We define the sum of the k neighbours' similarities as the accuracy metric alpha, and the number of user partitions across which we select the k neighbours as the security metric beta. We also generalise the k Nearest Neighbour attack to a beta-k Nearest Neighbours attack. Unlike the existing approach, which selects neighbours randomly across the entire candidate list, our method selects neighbours from each exclusive partition of size k with a decreasing probability. Theoretical and experimental analyses show that, for an accuracy-assured recommendation, our Partitioned Probabilistic Neighbour Selection method yields a better trade-off between recommendation accuracy and system security.
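    As a rough illustration of the selection step the abstract describes, the sketch below partitions the similarity-sorted candidate list into beta exclusive partitions of size k and samples k neighbours, favouring higher-similarity partitions with a decreasing probability. The geometric decay, the function name, and the data layout are assumptions for illustration, not the paper's actual design.

```python
import random

def partitioned_probabilistic_neighbour_selection(candidates, k, beta, decay=0.5):
    """Sketch only: `candidates` is a list of (user_id, similarity) pairs
    sorted by similarity in descending order. `decay` is an assumed
    parameter controlling how fast partition selection probability drops."""
    assert len(candidates) >= beta * k, "need beta*k candidates to partition"

    # Split the top beta*k candidates into beta exclusive partitions of size k.
    partitions = [list(candidates[i * k:(i + 1) * k]) for i in range(beta)]

    # Assumed geometrically decreasing selection weight per partition.
    weights = [decay ** i for i in range(beta)]

    selected = []
    while len(selected) < k:
        # Pick a non-empty partition with probability proportional to its weight,
        # then draw one neighbour from it without replacement.
        nonempty = [i for i, p in enumerate(partitions) if p]
        i = random.choices(nonempty, weights=[weights[j] for j in nonempty], k=1)[0]
        selected.append(partitions[i].pop(random.randrange(len(partitions[i]))))

    # Accuracy metric alpha: the sum of the selected neighbours' similarities.
    alpha = sum(sim for _, sim in selected)
    return selected, alpha
```

    Under this reading, beta = 1 reduces to picking the top-k neighbours deterministically, while larger beta spreads the selection across more partitions, trading some of alpha for resistance to the beta-k Nearest Neighbours attack.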

    PrivacyCanary: Privacy-aware recommenders with adaptive input obfuscation

    Recommender systems are widely used by online retailers to promote products and content that are most likely to interest a specific customer. In such systems, users implicitly or explicitly rate products they have consumed, and some form of collaborative filtering is used to find other users with similar tastes to whom the products can be recommended. While users benefit from more targeted and relevant recommendations, they are also exposed to greater risks of privacy loss, which can lead to undesirable financial and social consequences. The use of obfuscation techniques to preserve the privacy of user ratings is well studied in the literature. However, work on obfuscation typically assumes that all users apply the same level of obfuscation uniformly. In a heterogeneous environment, where users adopt different levels of obfuscation based on their comfort level, these differing levels may affect the users in the system in different ways. In this work we consider such a situation and make the following contributions: (a) using an offline dataset, we evaluate the privacy-utility trade-off in a system where a varying portion of users adopt the privacy-preserving technique; our study highlights the effects that each user's choices have, not only on their own experience but also on the utility that other users gain from the system; and (b) we propose PrivacyCanary, an interactive system that enables users to directly control the privacy-utility trade-off of the recommender system by probing it via a private (i.e., undisclosed to the system) set of items, achieving a desired accuracy while maximizing privacy protection. We evaluate the performance of our system with an offline recommendations dataset and show its effectiveness in balancing a target recommender accuracy with user privacy, compared to approaches that use a fixed privacy level.
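    The probing loop the abstract describes could look roughly like the sketch below: the user withholds a private probe set, submits noise-obfuscated ratings, and only lowers the noise level until the recommender's error on the probe set meets a target accuracy. The additive-noise model, the RMSE target, the step size, and the `recommender.update`/`recommender.predict` interface are all hypothetical stand-ins, not PrivacyCanary's actual design.

```python
import random

def calibrate_obfuscation(true_ratings, probe_items, recommender,
                          target_rmse=1.0, noise=2.0, step=0.25):
    """Sketch of an adaptive obfuscation loop: start with heavy noise and
    reduce it only as far as needed to hit a target accuracy, measured on
    a probe set that is never disclosed to the recommender."""
    while noise >= 0.0:
        # Obfuscate every disclosed rating with additive uniform noise at the
        # current level; items in the private probe set are never disclosed.
        disclosed = {item: r + random.uniform(-noise, noise)
                     for item, r in true_ratings.items()
                     if item not in probe_items}
        recommender.update(disclosed)          # hypothetical interface

        # Measure prediction error on the private probe set only.
        errors = [(recommender.predict(item) - true_ratings[item]) ** 2
                  for item in probe_items]
        rmse = (sum(errors) / len(errors)) ** 0.5

        if rmse <= target_rmse:
            return noise   # highest noise level that still meets the target
        noise -= step      # otherwise trade a little privacy for utility
    return 0.0
```

    The key point of this design, as described in the abstract, is that accuracy is assessed from the user's side via the undisclosed probe items, so the system never learns which ratings are being used to calibrate the obfuscation.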