2,385 research outputs found

    A probabilistic inference attack on suppressed social networks

    Social Networks (SNs) are now widely used by internet users to share personal information. Such networks are so rich in information that there is public and commercial benefit in sharing them with third parties. However, the information stored in SNs is mostly person-specific and subject to privacy concerns. One way to address these privacy issues is to give users control of their data, enabling them to suppress data they choose not to share with third parties. Unfortunately, such preference-based suppression techniques are not sufficient to protect privacy, mainly because they do not allow users to control data about other users they are linked with. Information about neighbors becomes an inference channel in an SN when there is a known correlation between the existence of a link between two users and the users having the same sensitive information. In this thesis, we propose a probabilistic inference attack on suppressed social network data that can successfully predict a suppressed label by looking at neighboring users' data. The attack algorithm is designed for a realistic adversary that knows, from background or external sources, the correlations between labels and links in the SN. We experimentally show that it is possible to recover the majority of the suppressed labels of users even in a highly suppressed SN.
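
    The attack sketched above reasons from neighbors' labels to a suppressed label using known label-link correlations. The following is only a minimal, hypothetical illustration of that idea as a naive-Bayes-style scoring over neighbor labels, not the thesis' actual algorithm; all names and probabilities are made up.

    from math import log

    def infer_suppressed_label(neighbor_labels, link_prob, label_prior):
        """Score each candidate label for a user whose own label is suppressed,
        using the observed labels of the user's neighbors and assumed-known
        label-link correlations; return the best-scoring candidate."""
        best, best_score = None, float("-inf")
        for candidate, prior in label_prior.items():
            # naive-Bayes-style independence assumption across neighbors
            score = log(prior) + sum(log(link_prob[candidate].get(nb, 1e-9))
                                     for nb in neighbor_labels)
            if score > best_score:
                best, best_score = candidate, score
        return best

    # Toy correlations: a link mostly connects users with the same label.
    link_prob = {"A": {"A": 0.8, "B": 0.2}, "B": {"A": 0.3, "B": 0.7}}
    label_prior = {"A": 0.5, "B": 0.5}
    print(infer_suppressed_label(["A", "A", "B"], link_prob, label_prior))  # -> "A"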

    A survey on privacy in human mobility

    In recent years we have witnessed a pervasive use of location-aware technologies such as GPS-enabled vehicular devices, RFID-based tools, and mobile phones, which has led to the collection and storage of a large amount of human mobility data. The power of this data has been recognized by both the scientific community and industry. Human mobility data can be used for different purposes, such as urban traffic management, urban planning, and urban pollution estimation. Unfortunately, data describing human mobility is sensitive, because people's whereabouts may allow re-identification of individuals in a de-identified database, and access to the places visited by individuals may enable the inference of sensitive information such as religious beliefs, sexual preferences, health conditions, and so on. The literature reports many approaches aimed at overcoming privacy issues in mobility data, so in this survey we discuss the advancements in privacy-preserving mobility data publishing. We first describe the adversarial attack and privacy models typically considered for mobility data, then we present frameworks for privacy risk assessment, and finally we discuss three main categories of privacy-preserving strategies: methods based on anonymization of mobility data, methods based on differential privacy models, and methods that protect privacy by exploiting generative models for synthetic trajectory generation.
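
    As a generic illustration of the differential-privacy family of strategies mentioned above (not a method from any particular surveyed paper), the standard Laplace mechanism can release noisy per-location visit counts; the locations and the epsilon value below are hypothetical.

    import numpy as np

    def dp_visit_counts(true_counts, epsilon, sensitivity=1.0):
        """Release per-location visit counts with Laplace(sensitivity/epsilon) noise,
        assuming each individual contributes at most `sensitivity` to any one count."""
        scale = sensitivity / epsilon
        return {loc: count + np.random.laplace(0.0, scale)
                for loc, count in true_counts.items()}

    # Hypothetical aggregate built from trajectory data, released with epsilon = 0.5.
    print(dp_visit_counts({"station": 120, "park": 45, "clinic": 8}, epsilon=0.5))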

    Privacy and Transparency in Graph Machine Learning: A Unified Perspective

    Graph Machine Learning (GraphML), whereby classical machine learning is generalized to irregular graph domains, has enjoyed a recent renaissance, leading to a dizzying array of models and their applications in several domains. With its growing applicability to sensitive domains and regulations by government agencies for trustworthy AI systems, researchers have started looking into the issues of transparency and privacy of graph learning. However, these topics have been mainly investigated independently. In this position paper, we provide a unified perspective on the interplay of privacy and transparency in GraphML.

    Mathematical techniques for the protection of patient's privacy in medical databases

    In modern society, maintaining the balance between privacy and public access to information is an increasingly widespread problem. Valid data is crucial for many kinds of research, but the public good should not be achieved at the expense of individuals. While creating a central database of patients, the CSIOZ wishes to provide statistical information to selected institutions. However, there are plans to extend this access by providing the statistics to researchers or even to citizens. This might pose a significant risk of disclosing private, sensitive information about individuals. This report proposes methods to prevent such data leaks. One category of suggestions is based on the idea of modifying the statistics so that they remain useful to statisticians while guaranteeing the protection of patients' privacy. Another group of proposed mechanisms, though sometimes difficult to implement, enables one to obtain precise statistics while restricting queries that might reveal sensitive information.
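
    A common textbook instance of the second group of mechanisms, restricting queries that might reveal sensitive information, is query-set-size restriction: an aggregate query is answered only if enough patients match it. The sketch below is purely illustrative; the threshold and record fields are hypothetical and not taken from the report.

    MIN_GROUP_SIZE = 5  # hypothetical threshold below which a count is refused

    def answer_count(records, predicate):
        """Answer a COUNT query only if enough patients match, so the result
        cannot describe a small, potentially identifiable group."""
        matching = sum(1 for r in records if predicate(r))
        if matching < MIN_GROUP_SIZE:
            raise ValueError("query refused: result would describe too few patients")
        return matching

    patients = [{"age": 34, "diagnosis": "flu"}, {"age": 71, "diagnosis": "diabetes"}] * 6
    print(answer_count(patients, lambda r: r["diagnosis"] == "flu"))  # 6, above the threshold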

    PRIVAS - automatic anonymization of databases

    Given current technological evolution, data and information are increasingly valuable across diverse areas and for a wide variety of purposes. Although the information and knowledge discovered by exploring and using data can be very valuable in many applications, people are increasingly concerned about the other side: the privacy threats that these processes bring. The Privas system, described in this paper, helps the data publisher pre-process a database before publishing it. To that end, a DSL is used to describe the database schema and to identify the sensitive data and the desired privacy level. The Privas processor then interprets the DSL program to automatically transform the repository schema. The automation of the anonymization process is the main contribution and novelty of this work.
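
    The paper's DSL syntax is not reproduced here; the snippet below is only a hypothetical sketch of the kind of transformation such a processor could apply once the schema description marks a column as identifying, generalizing exact values before the table is published.

    def generalize_age(age, bucket=10):
        """Replace an exact age with a coarser range, e.g. 42 -> '40-49'."""
        low = (age // bucket) * bucket
        return f"{low}-{low + bucket - 1}"

    rows = [{"age": 42, "zip": "4700-123"}, {"age": 47, "zip": "4700-456"}]
    published = [{"age": generalize_age(r["age"]), "zip": r["zip"][:4] + "-***"}
                 for r in rows]
    print(published)  # both rows now read {'age': '40-49', 'zip': '4700-***'}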

    Privacy in trajectory micro-data publishing : a survey

    We survey the literature on the privacy of trajectory micro-data, i.e., spatiotemporal information about the mobility of individuals, whose collection is becoming increasingly simple and frequent thanks to emerging information and communication technologies. The focus of our review is on privacy-preserving data publishing (PPDP), i.e., the publication of databases of trajectory micro-data that preserve the privacy of the monitored individuals. We classify and present the literature on attacks against trajectory micro-data, as well as the solutions proposed to date for protecting databases from such attacks. This paper serves as an introductory reading on a critical subject in an era of growing awareness about the privacy risks connected to digital services, and provides insights into open problems and future directions for research. (Accepted for publication in Transactions on Data Privacy.)
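
    As a generic illustration of the attack class such surveys cover (not a specific attack from this paper), a few known spatiotemporal points are often enough to single out one trajectory in a pseudonymized database; all identifiers below are hypothetical.

    def matching_trajectories(database, known_points):
        """Return the IDs of trajectories containing every (location, time) pair
        the adversary already knows; if exactly one remains, the individual is
        re-identified and the rest of their trajectory is exposed."""
        return [tid for tid, traj in database.items()
                if all(p in traj for p in known_points)]

    # Hypothetical pseudonymized database of (cell, hour) visits.
    database = {
        "u1": {("home_cell", 8), ("office_cell", 9), ("gym_cell", 18)},
        "u2": {("home_cell", 8), ("mall_cell", 10)},
    }
    print(matching_trajectories(database, [("home_cell", 8), ("office_cell", 9)]))  # ['u1']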