
    Systematizing Genome Privacy Research: A Privacy-Enhancing Technologies Perspective

    Rapid advances in human genomics are enabling researchers to gain a better understanding of the role of the genome in our health and well-being, stimulating hope for more effective and cost-efficient healthcare. However, this also prompts a number of security and privacy concerns stemming from the distinctive characteristics of genomic data. To address them, a new research community has emerged and produced a large number of publications and initiatives. In this paper, we rely on a structured methodology to contextualize and provide a critical analysis of the current knowledge on privacy-enhancing technologies used for testing, storing, and sharing genomic data, using a representative sample of the work published in the past decade. We identify and discuss limitations, technical challenges, and issues faced by the community, focusing in particular on those that are inherently tied to the nature of the problem and are harder for the community alone to address. Finally, we report on the importance and difficulty of the identified challenges based on an online survey of genome data privacy experts.
    Comment: To appear in the Proceedings on Privacy Enhancing Technologies (PoPETs), Vol. 2019, Issue

    Privacy metrics and boundaries

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessing and comparing different user scenarios and their differences (for examples of scenarios, see [4]); to define a notion of privacy boundary, designed to encompass the set of information, behaviours, actions, and processes which the privacy protector can accept to expose to an information gatherer under an agreement with said party, where everything outside the boundary is not acceptable and justifies not entering into the agreement; and to characterize the contribution of privacy-enhancing technologies (PETs). A full case, a Cisco Inc. privacy agreement, is given with the determination of the qualitative and quantitative privacy metrics and their envelope.
    Keywords: Privacy; Metrics; Set theory; Economics; Privacy enhancing technologies
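    A minimal sketch of the privacy-boundary idea, assuming the boundary is modeled as a plain set of data items the protector agrees to expose and the quantitative metric as the fraction of the gatherer's requests that fall inside that set; the function and variable names here are hypothetical illustrations, not the paper's formal definitions.

```python
# Hypothetical sketch: a privacy boundary as a set of items the privacy
# protector agrees to expose under an agreement, and a toy quantitative
# metric measuring how much of the gatherer's request stays inside it.

def boundary_coverage(boundary: set, requested: set) -> float:
    """Fraction of requested items that lie inside the privacy boundary."""
    if not requested:
        return 1.0
    return len(requested & boundary) / len(requested)

# Example: the agreement covers name and email, but the gatherer also asks
# for location, so one of three requested items falls outside the boundary.
boundary = {"name", "email"}
requested = {"name", "email", "location"}
print(boundary_coverage(boundary, requested))  # ~0.67
print(requested - boundary)                    # items outside the boundary: grounds to refuse the agreement
```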

    Generating Artificial Data for Private Deep Learning

    In this paper, we propose generating artificial data that retain statistical properties of real data as the means of providing privacy with respect to the original dataset. We use a generative adversarial network to draw privacy-preserving artificial data samples and derive an empirical method to assess the risk of information disclosure in a differential-privacy-like way. Our experiments show that we are able to generate artificial data of high quality and successfully train and validate machine learning models on this data while limiting potential privacy loss.
    Comment: Privacy-Enhancing Artificial Intelligence and Language Technologies, AAAI Spring Symposium Series, 201
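    A minimal sketch of the general approach of releasing generator outputs instead of the original records, assuming a toy 2-D Gaussian "real" dataset and small PyTorch networks; the architecture sizes and training loop are illustrative assumptions, not the authors' implementation, and the paper's empirical disclosure-risk assessment is not reproduced here.

```python
# Hypothetical sketch: train a small GAN on "real" samples, then release only
# artificial samples drawn from the generator in place of the original data.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy "real" dataset: a shifted 2-D Gaussian (stand-in for the sensitive data).
real_data = torch.randn(1000, 2) * 0.5 + torch.tensor([2.0, -1.0])

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))   # generator: noise -> sample
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))   # discriminator: sample -> logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    # Discriminator step: distinguish real samples from generated ones.
    real = real_data[torch.randint(0, len(real_data), (64,))]
    fake = G(torch.randn(64, 8)).detach()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: produce samples the discriminator accepts as real.
    fake = G(torch.randn(64, 8))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Release artificial samples instead of the original records.
artificial = G(torch.randn(1000, 8)).detach()
print(artificial.mean(dim=0))  # should roughly track the real data's mean
```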