767 research outputs found

    Privacy metrics and boundaries

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow assessing and comparing different user scenarios and their differences (for examples of scenarios, see [4]); to define a notion of privacy boundary, designed to encompass the set of information, behaviours, actions and processes which the privacy protector can accept to expose to an information gatherer under an agreement with said party (everything outside the boundary is not acceptable and justifies not entering into the agreement); and to characterize the contribution of privacy-enhancing technologies (PET). A full case is given with the qualitative and quantitative privacy metrics determination and envelope, i.e. a Cisco Inc. privacy agreement. Keywords: Privacy; Metrics; Set theory; Economics; Privacy enhancing technologies
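    The privacy boundary described above amounts to a set-membership condition; a minimal sketch of the acceptability test it implies (function and item names are illustrative, not from the paper):

```python
def acceptable(requested_items, boundary):
    """Privacy-boundary test: an agreement with an information gatherer
    is acceptable only if every requested item lies inside the privacy
    protector's boundary set; anything outside the boundary justifies
    not entering into the agreement."""
    return set(requested_items) <= set(boundary)

# Hypothetical boundary a privacy protector might accept:
boundary = {"email", "device model", "crash logs"}
print(acceptable(["email", "crash logs"], boundary))  # True
print(acceptable(["email", "location"], boundary))    # False
```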

    Technical Privacy Metrics: a Systematic Survey

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature make an informed choice of metrics challenging. As a result, instead of using existing metrics, new metrics are proposed frequently, and privacy studies are often incomparable. In this survey we alleviate these problems by structuring the landscape of privacy metrics. To this end, we explain and discuss a selection of over eighty privacy metrics and introduce categorizations based on the aspect of privacy they measure, their required inputs, and the type of data that needs protection. In addition, we present a method for choosing privacy metrics based on nine questions that help identify the right privacy metrics for a given scenario, and highlight topics where additional work on privacy metrics is needed. Our survey spans multiple privacy domains and can be understood as a general framework for privacy measurement.
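    As a concrete instance of one family such surveys cover (uncertainty-based metrics), a minimal sketch of Shannon-entropy anonymity; the eight-candidate example is illustrative, not from the paper:

```python
import math

def anonymity_entropy(probabilities):
    """Shannon entropy of the attacker's probability distribution over
    candidate users -- a classic uncertainty-based privacy metric.
    Higher entropy means the attacker is less certain, i.e. more privacy."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform distribution over 8 candidates gives log2(8) = 3 bits.
print(anonymity_entropy([1 / 8] * 8))              # 3.0

# A skewed distribution (attacker is fairly sure) gives less.
print(anonymity_entropy([0.9] + [0.1 / 7] * 7) < 3.0)  # True
```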

    Measuring Privacy in Vehicular Networks

    Vehicular communication plays a key role in near-future automotive transport, promising features like increased traffic safety or wireless software updates. However, vehicular communication can expose driver locations and thus poses important privacy risks. Many schemes have been proposed to protect privacy in vehicular communication, and their effectiveness is usually shown using privacy metrics. However, to the best of our knowledge, (1) different privacy metrics have never been compared to each other, and (2) it is unknown how strong the metrics are. In this paper, we argue that privacy metrics should be monotonic, i.e. that they indicate decreasing privacy for increasing adversary strength, and we evaluate the monotonicity of 32 privacy metrics on real and synthetic traffic with state-of-the-art adversary models. Our results indicate that most privacy metrics are weak at least in some situations. We therefore recommend using metrics suites, i.e. combinations of privacy metrics, when evaluating new privacy-enhancing technologies.
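    The monotonicity criterion can be sketched as a simple check over metric values obtained against adversaries of increasing strength (a simplified illustration; the paper's evaluation is statistical rather than this strict element-wise comparison):

```python
def is_monotonic(metric_values, higher_is_more_private=True):
    """Monotonicity check for a privacy metric: as adversary strength
    grows (increasing list index), a sound metric should report
    non-increasing privacy."""
    if not higher_is_more_private:
        metric_values = [-v for v in metric_values]
    return all(a >= b for a, b in zip(metric_values, metric_values[1:]))

# Entropy-style metric evaluated against adversaries of growing strength:
print(is_monotonic([3.0, 2.1, 1.4, 0.2]))  # True  -- behaves as expected
print(is_monotonic([3.0, 1.0, 2.5, 0.2]))  # False -- 'weak' in this situation
```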

    Using Metrics Suites to Improve the Measurement of Privacy in Graphs

    Social graphs are widely used in research (e.g., epidemiology) and business (e.g., recommender systems). However, sharing these graphs poses privacy risks because they contain sensitive information about individuals. Graph anonymization techniques aim to protect individual users in a graph, while graph de-anonymization aims to re-identify users. The effectiveness of anonymization and de-anonymization algorithms is usually evaluated with privacy metrics. However, it is unclear how strong existing privacy metrics are when they are used in graph privacy. In this paper, we study 26 privacy metrics for graph anonymization and de-anonymization and evaluate their strength in terms of three criteria: monotonicity indicates whether the metric reports lower privacy for stronger adversaries; for within-scenario comparisons, evenness indicates whether metric values are spread evenly; and for between-scenario comparisons, shared value range indicates whether metrics use a consistent value range across scenarios. Our extensive experiments indicate that no single metric fulfills all three criteria perfectly. We therefore use methods from multi-criteria decision analysis to aggregate multiple metrics in a metrics suite, and we show that these metrics suites improve monotonicity compared to the best individual metric. This important result enables more monotonic, and thus more accurate, evaluations of new graph anonymization and de-anonymization algorithms.
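    The paper aggregates metrics using multi-criteria decision analysis; a much simpler weighted-sum sketch of the same idea (the function name, weights, and metric names are illustrative, not the paper's method):

```python
def suite_score(metric_values, weights):
    """Aggregate several normalized privacy metric values (each in
    [0, 1], higher = more privacy) into a single metrics-suite score
    via a weighted sum -- a simple stand-in for MCDA aggregation."""
    assert len(metric_values) == len(weights)
    return sum(v * w for v, w in zip(metric_values, weights)) / sum(weights)

# Hypothetical normalized metrics for one anonymized graph:
metrics = {"entropy": 0.9, "k-anonymity": 0.4, "adv. error": 0.7}
print(suite_score(list(metrics.values()), [1, 1, 2]))  # 0.675
```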

    Taxonomy for Information Privacy Metrics

    A comprehensive privacy framework is essential for the progress of the information privacy field. Practical implications of such a framework include laying the foundation for building information privacy metrics and enabling fruitful discussion. A taxonomy is an essential step in building a framework. This research study attempts to build a taxonomy for the information privacy domain based on empirical data. The classical grounded theory approach introduced by Glaser was applied, and incidents reported by the International Association of Privacy Professionals (IAPP) were used to build the taxonomy. These incidents include privacy-related current research works, data breaches, personal views, interviews, and technological innovations. TAMZAnalyzer, an open-source qualitative data analysis tool, was used for coding, keeping memos, sorting, and creating categories. The taxonomy is presented in seven themes and several categories covering legal, technical, and ethical aspects. The findings of this study help practitioners understand and discuss the subject, and help academia work toward building a comprehensive framework and metrics for the information privacy domain.

    On the Measurement of Privacy as an Attacker's Estimation Error

    A wide variety of privacy metrics have been proposed in the literature to evaluate the level of protection offered by privacy-enhancing technologies. Most of these metrics are specific to concrete systems and adversarial models, and are difficult to generalize or translate to other contexts. Furthermore, a better understanding of the relationships between the different privacy metrics is needed to enable a more grounded and systematic approach to measuring privacy, as well as to assist system designers in selecting the most appropriate metric for a given application. In this work we propose a theoretical framework for privacy-preserving systems, endowed with a general definition of privacy in terms of the estimation error incurred by an attacker who aims to disclose the private information that the system is designed to conceal. We show that our framework permits interpreting and comparing a number of well-known metrics under a common perspective. The arguments behind these interpretations are based on fundamental results from information theory, probability theory, and Bayes decision theory.
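    The estimation-error view can be illustrated with a tiny Bayes-decision example: privacy is the attacker's minimum expected loss under its posterior over the secret (the function name and the Hamming-loss example are illustrative, not the paper's formalism):

```python
def expected_estimation_error(posterior, loss):
    """Privacy as the attacker's estimation error: the attacker picks
    the estimate that minimizes expected loss under its posterior over
    the secret; the remaining (Bayes) error is the privacy level."""
    candidates = list(posterior)
    return min(
        sum(posterior[x] * loss(est, x) for x in candidates)
        for est in candidates
    )

# Hamming loss over 4 equiprobable secrets: the best guess is still
# wrong 3/4 of the time, so privacy = 0.75.
posterior = {s: 0.25 for s in "abcd"}
print(expected_estimation_error(posterior, lambda e, x: e != x))  # 0.75
```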

    Developing and Testing Visual Privacy Metrics

    The dense redevelopment of inner cities (intensification) has been accompanied by a dramatic surge in the development of multi-unit residential buildings (MURBs) within ever-shrinking proximity to one another. Modern multi-unit residential building design often embodies conflicting desires for daylighting and visual privacy, or designers simply do not consider collective occupant discomfort factors. Thus, the focus of this project was to develop and validate conceptual and quantitative variables influencing visual privacy, so that future and existing residential designs can be analyzed from a visual privacy perspective. This paper formulates an approach that combines building physics (visual angles and relative brightness) with social and psychological factors to avoid conflicts between competing aspirations for sustainable and resilient buildings that promote occupant wellbeing.
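    One of the building-physics variables named above, the visual angle, can be sketched with the standard angular-size formula (this is a generic illustration, not necessarily the paper's exact computation):

```python
import math

def visual_angle_deg(target_width_m, distance_m):
    """Angular size of a facing window as seen from a neighbouring
    unit, via the angular-size formula 2*atan(w / 2d).  A larger
    angle means the neighbouring interior fills more of the view,
    i.e. less visual privacy."""
    return math.degrees(2 * math.atan(target_width_m / (2 * distance_m)))

# A 2 m wide window across a 10 m gap subtends ~11.4 degrees;
# across 30 m, only ~3.8 degrees.
print(round(visual_angle_deg(2, 10), 1))  # 11.4
print(round(visual_angle_deg(2, 30), 1))  # 3.8
```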

    On the Relationship Between Inference and Data Privacy in Decentralized IoT Networks

    In a decentralized Internet of Things (IoT) network, a fusion center receives information from multiple sensors to infer a public hypothesis of interest. To prevent the fusion center from abusing the sensor information, each sensor sanitizes its local observation using a local privacy mapping, which is designed to achieve both inference privacy of a private hypothesis and data privacy of the sensor raw observations. Various inference and data privacy metrics have been proposed in the literature. We introduce the concepts of privacy implication and non-guarantee to study the relationships between these privacy metrics. We propose an optimization framework in which both local differential privacy (data privacy) and information privacy (inference privacy) metrics are incorporated. In the parametric case where the sensor observations' distributions are known a priori, we propose a two-stage local privacy mapping at each sensor, and show that such an architecture is able to achieve information privacy and local differential privacy to within the predefined budgets. For the nonparametric case where the sensor distributions are unknown, we adopt an empirical optimization approach. Simulation and experiment results demonstrate that our proposed approaches allow the fusion center to accurately infer the public hypothesis while protecting both inference and data privacy.
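    For the data-privacy half, binary randomized response is the textbook mechanism satisfying eps-local differential privacy; a minimal sketch of such a local privacy mapping (the paper's actual two-stage mapping is more elaborate):

```python
import math
import random

def randomized_response(bit, epsilon):
    """Binary randomized response: report the true bit with probability
    e^eps / (e^eps + 1), otherwise flip it.  This satisfies eps-local
    differential privacy, so the fusion center cannot confidently
    recover any single sensor's raw observation."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p_truth else 1 - bit

# Low epsilon -> near-coin-flip reports (strong privacy, noisy data);
# high epsilon -> almost always the true bit (weak privacy).
print(randomized_response(1, 0.5) in (0, 1))  # True
```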