
    Local Differentially Private Matrix Factorization with MoG for Recommendations

    Unethical data aggregation practices of many recommendation systems have raised privacy concerns among users. Local differential privacy (LDP) based recommendation systems address this problem by perturbing a user's original data locally, on their own device, before sending it to the data aggregator (DA). The DA then performs recommendations over the perturbed data, which causes substantial prediction error. To tackle the privacy and utility issues that arise with an untrustworthy DA, we propose a novel LDP matrix factorization (MF) method with a mixture of Gaussians (MoG). We use a bounded Laplace mechanism (BLP) to perturb a user's original ratings locally. BLP restricts the perturbed ratings to a predefined output domain, thus reducing the level of noise aggregated at the DA. The MoG method estimates the noise added to the original ratings, which further improves prediction accuracy without violating the principles of differential privacy (DP). On the Movielens and Jester datasets, we demonstrate that our method offers higher prediction accuracy under strong privacy protection than existing LDP recommendation methods.
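
    One common way to realize a bounded-output Laplace perturbation is rejection sampling: draw Laplace noise and resample until the perturbed value lands in the allowed rating domain. The sketch below is illustrative only; the paper's actual BLP additionally recalibrates the noise scale so that epsilon-DP still holds under truncation, which this toy version omits. The function name and parameters are hypothetical.

```python
import random

def bounded_laplace(value, scale, lo, hi, rng=random):
    """Perturb `value` with Laplace(0, scale) noise, resampling until
    the perturbed value falls in [lo, hi].  The difference of two
    Exp(1) draws, multiplied by `scale`, is a Laplace(0, scale) sample."""
    while True:
        noise = scale * (rng.expovariate(1.0) - rng.expovariate(1.0))
        perturbed = value + noise
        if lo <= perturbed <= hi:
            return perturbed

# Example: perturb a 5-star rating of 3.0 while staying in [1, 5].
r = bounded_laplace(3.0, 1.0, 1.0, 5.0)
assert 1.0 <= r <= 5.0
```

    Bounding the output this way keeps obviously invalid ratings (e.g. negative stars) out of the aggregate, which is what reduces the effective noise level at the DA.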

    A Lightweight Privacy-Preserved Spatial and Temporal Aggregation of Energy Data

    The smart grid provides fine-grained, real-time energy consumption data and can thereby improve the efficiency of energy management. Because it enables the collection of energy consumption data from consumers, it has raised serious privacy concerns. Energy consumption data, a form of personal information that reveals behavioral patterns, can be used to identify the electrical appliances a user operates through the electricity load signature, making it possible to further reveal a household's residency pattern or appliance usage habits. This paper proposes to enhance the privacy of energy consumption data by enabling the utility to retrieve aggregated spatial and temporal consumption without revealing individual energy consumption. We use a lightweight cryptographic mechanism that masks the energy consumption data by adding random noise to each energy reading, and we use Paillier's additive homomorphic encryption to protect the noise. When the masked energy consumption data are summed for either spatial or temporal aggregation, the noise terms cancel each other out, yielding either the total energy consumed in a neighbourhood at a particular time or the total energy consumed by a household in a day. No third party can derive the energy consumption pattern of a household in real time. A proof-of-concept was implemented to demonstrate the feasibility of the system, and the results show that it can be deployed efficiently on a low-cost computing platform.
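
    The canceling-noise idea can be sketched with pairwise masks: each pair of households (i, j) shares a random secret that i adds and j subtracts, so individual masked readings look random but the masks vanish in the spatial sum. This is a minimal sketch of the masking step only; the paper's scheme additionally protects the noise values with Paillier encryption, which is not shown here, and all names are hypothetical.

```python
import random

def make_masks(n_households, seed=0):
    """Pairwise canceling masks: households i < j share a random secret
    s_ij; i adds +s_ij and j adds -s_ij, so all masks sum to zero."""
    rng = random.Random(seed)
    masks = [0.0] * n_households
    for i in range(n_households):
        for j in range(i + 1, n_households):
            s = rng.uniform(-10.0, 10.0)
            masks[i] += s
            masks[j] -= s
    return masks

readings = [1.2, 3.4, 0.7, 2.1]          # per-household kWh readings
masks = make_masks(len(readings))
masked = [r + m for r, m in zip(readings, masks)]

# No single masked reading reveals the true value, but the masks
# cancel, so the spatial aggregate is exact.
assert abs(sum(masked) - sum(readings)) < 1e-9
```

    The same construction works along the time axis for temporal aggregation: masks shared across a household's readings within a day cancel when the day is summed.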

    Privately Connecting Mobility to Infectious Diseases via Applied Cryptography

    Human mobility is undisputedly one of the critical factors in infectious disease dynamics. Until a few years ago, researchers had to rely on static data to model human mobility, which was then combined with a transmission model of a particular disease to form an epidemiological model. Recent works have consistently shown that substituting mobile phone data for the static mobility data leads to significantly more accurate models. While prior studies have relied exclusively on aggregated data from a mobile network operator's subscribers, it may be preferable to consider the aggregated mobility data of infected individuals only. Clearly, naively linking mobile phone data with infected individuals would massively intrude on privacy. This research aims to develop a solution that reports the aggregated mobile phone location data of infected individuals while maintaining compliance with privacy expectations. To achieve privacy, we use homomorphic encryption, zero-knowledge proof techniques, and differential privacy. Our protocol's open-source implementation can process eight million subscribers in one and a half hours. Additionally, we provide a legal analysis of our solution with regard to the EU General Data Protection Regulation. (Comment: added differential privacy experiments and a new benchmark.)

    PriPeARL: A Framework for Privacy-Preserving Analytics and Reporting at LinkedIn

    Preserving the privacy of users is a key requirement of web-scale analytics and reporting applications, and it has received renewed focus in light of recent data breaches and new regulations such as GDPR. We focus on the problem of computing robust, reliable analytics in a privacy-preserving manner while satisfying product requirements. We present PriPeARL, a framework for privacy-preserving analytics and reporting inspired by differential privacy. We describe the overall design and architecture and the key modeling components, focusing on the unique challenges associated with privacy, coverage, utility, and consistency. We perform an experimental study in the context of ads analytics and reporting at LinkedIn, demonstrating the trade-offs between privacy and utility needs and the applicability of privacy-preserving mechanisms to real-world data. We also highlight the lessons learned from the production deployment of our system at LinkedIn. (Comment: conference information: ACM International Conference on Information and Knowledge Management, CIKM 2018.)
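
    A differentially private analytics count of the kind such frameworks report can be sketched as Laplace noise on a count query followed by post-processing (rounding and clamping) so the reported value remains a plausible count. This is a generic sketch of the underlying mechanism, not PriPeARL's actual algorithm, and the names are hypothetical.

```python
import random

def laplace(scale, rng=random):
    # Difference of two Exp(1) samples, scaled: a Laplace(0, scale) draw.
    return scale * (rng.expovariate(1.0) - rng.expovariate(1.0))

def private_count(true_count, epsilon, rng=random):
    """Answer a count query under epsilon-DP.  Counts have sensitivity 1,
    so Laplace(1/epsilon) noise suffices; rounding and clamping to >= 0
    are post-processing and do not weaken the guarantee."""
    noisy = true_count + laplace(1.0 / epsilon, rng)
    return max(0, round(noisy))
```

    Consistency across related queries (one of the challenges the abstract mentions) requires more machinery, e.g. fixing the noise per query via seeding, which this sketch leaves out.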

    How to Balance Privacy and Money through Pricing Mechanism in Personal Data Market

    A personal data market is a platform with three participants: data owners (individuals), data buyers, and a market maker. Data owners who provide personal data are compensated according to their privacy loss. Data buyers can submit a query and pay for the result according to their desired accuracy. The market maker coordinates between data owners and buyers. This framework has been studied previously on the basis of differential privacy. However, the previous study assumes that data owners can accept any level of privacy loss and that data buyers can conduct transactions without regard to a financial budget. In this paper, we propose a practical personal data trading framework that strikes a balance between money and privacy. To gain insight into user preferences, we first conducted an online survey on attitudes toward privacy and interest in personal data trading. Second, we identify five key principles of a personal data market, which are important for designing a reasonable trading framework and pricing mechanism. Third, we propose a reasonable trading framework for personal data that provides an overview of how the data is traded. Fourth, we propose a balanced pricing mechanism that computes the query price for data buyers and the compensation for data owners (whose data are utilized) as a function of their privacy loss. The main goal is to ensure fair trading for both parties. Finally, we conduct an experiment to evaluate the output of our proposed pricing mechanism in comparison with previously proposed mechanisms.
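
    The shape of such a pricing mechanism can be illustrated with the simplest possible instance: compensation linear in each owner's privacy loss epsilon_i, and a query price equal to the total compensation plus a market-maker margin. This is a hypothetical toy, not the paper's balanced mechanism; the rate and margin parameters are invented for illustration.

```python
def compensation(epsilons, rate=0.05):
    """Hypothetical linear rule: pay each owner rate * eps_i
    for incurring privacy loss eps_i on this query."""
    return [rate * e for e in epsilons]

def query_price(epsilons, rate=0.05, margin=0.10):
    """Buyer pays the owners' total compensation plus the
    market maker's margin."""
    return sum(compensation(epsilons, rate)) * (1.0 + margin)

# Three owners with privacy losses 1.0, 2.0, 3.0:
# total compensation 0.30, buyer pays 0.33.
price = query_price([1.0, 2.0, 3.0])
assert abs(price - 0.33) < 1e-9
```

    A higher desired accuracy translates into larger epsilon_i values, so this shape already couples the buyer's accuracy demand to both the owners' payout and the query price.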