
    Privacy-Aware Recommendation with Private-Attribute Protection using Adversarial Learning

    Recommendation is one of the critical applications that helps users find information relevant to their interests. However, a malicious attacker can infer users' private information from recommendations. Prior work obfuscates user-item data before sharing it with the recommendation system; this approach does not explicitly address recommendation quality while performing data obfuscation, and it cannot protect users against private-attribute inference attacks based on the recommendations themselves. This work is the first attempt to build a Recommendation with Attribute Protection (RAP) model that simultaneously recommends relevant items and counters private-attribute inference attacks. The key idea of our approach is to formulate this problem as an adversarial learning problem with two main components: the private-attribute inference attacker and the Bayesian personalized recommender. The attacker seeks to infer users' private attributes from their item lists and recommendations; the recommender aims to extract users' interests while employing the attacker to regularize the recommendation process. Experiments show that the proposed model both preserves the quality of the recommendation service and protects users against private-attribute inference attacks.

    Comment: The Thirteenth ACM International Conference on Web Search and Data Mining (WSDM 2020)
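    The adversarial formulation described in the abstract can be pictured as a min-max game: an attacker is trained to predict a private attribute from user representations, while the recommender is trained to fit user-item interactions minus a term that rewards fooling the attacker. The sketch below is a minimal illustration of that idea, not the paper's RAP model: the toy data, the logistic losses, the hand-rolled gradients, the dimensions, and the trade-off weight `lam` are all assumptions invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup: 20 users, 8 items, a binary private attribute per user.
    n_users, n_items, d = 20, 8, 4
    ratings = rng.integers(0, 2, size=(n_users, n_items)).astype(float)
    private_attr = rng.integers(0, 2, size=n_users).astype(float)

    U = rng.normal(0, 0.1, (n_users, d))   # user embeddings
    V = rng.normal(0, 0.1, (n_items, d))   # item embeddings
    w = rng.normal(0, 0.1, d)              # attacker's linear probe on U

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    lam, lr = 0.5, 0.1                     # adversarial weight, learning rate
    for step in range(200):
        # Attacker step: logistic regression of the private attribute on U.
        p = sigmoid(U @ w)
        w += lr * U.T @ (private_attr - p)          # ascend attacker log-likelihood

        # Recommender step: fit ratings, but push U where the attacker fails.
        pred = sigmoid(U @ V.T)
        grad_rec_U = (pred - ratings) @ V           # d(rec loss)/dU
        grad_rec_V = (pred - ratings).T @ U         # d(rec loss)/dV
        p = sigmoid(U @ w)
        grad_att_U = np.outer(p - private_attr, w)  # d(attacker loss)/dU
        U -= lr * (grad_rec_U - lam * grad_att_U)   # minimize rec loss - lam * attacker loss
        V -= lr * grad_rec_V

    pred = sigmoid(U @ V.T)
    rec_loss = -np.mean(ratings * np.log(pred + 1e-9)
                        + (1 - ratings) * np.log(1 - pred + 1e-9))
    attacker_acc = np.mean((sigmoid(U @ w) > 0.5) == (private_attr > 0.5))
    print(round(rec_loss, 3), round(attacker_acc, 3))
    ```

    The recommender's update subtracts `lam * grad_att_U`, i.e. it descends the negated attacker loss, so the learned user embeddings trade off predictive power for the private attribute against fit to the ratings; the actual RAP model uses a Bayesian personalized recommender and a learned attacker network rather than these toy linear pieces.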

    Safeguarding Personal Data: Meta Consent as a Remedy to Section 28(2)(c) of Kenya’s Data Protection Act

    Biometric identity systems have been adopted in the Global South, following the Global North's lead. The greatest discrepancy, however, lies in the legal frameworks that govern the use, storage and processing of the data collected. The Kenyan government's roll-out of the Huduma Namba registration exercise in April 2019, with no data protection law then in force in Kenya, exemplifies this. Parliament subsequently passed the Data Protection Act. Unfortunately, parts of this law do not go far enough to protect personal data. Deviating from the requirement that personal data be collected directly from the data subject, section 28(2)(c) of the referenced Act permits indirect collection of personal data from a source other than the data subject. Relying on desk-based research and using the Huduma Namba exercise as a case study, this paper examines this permission and the danger it poses to the privacy of Kenyans' personal data. Finding that section 28(2)(c) exposes personal data to the privacy violations of secondary use and exclusion, and so threatens the right to privacy, this research suggests the meta consent model, as embraced by the healthcare sector, as a feasible solution. This model allows data subjects to determine their consent preferences (i.e., how and when they wish their consent to be sought for further collection and use) at the point of primary collection of personal data. Additionally, this paper recommends that the judiciary embrace the model in its adjudication of matters and, finally, that an amendment incorporating this solution be made to the Act.