
    Towards trajectory anonymization: a generalization-based approach

    Trajectory datasets are becoming popular due to the massive usage of GPS and location-based services. In this paper, we address privacy issues regarding the identification of individuals in static trajectory datasets. We first adapt the notion of k-anonymity to trajectories and propose a novel generalization-based approach for the anonymization of trajectories. We further show that releasing anonymized trajectories may still entail privacy leaks. We therefore propose a randomization-based reconstruction algorithm for releasing anonymized trajectory data, and also show how the underlying techniques can be adapted to other anonymity standards. Experimental results on real and synthetic trajectory datasets show the effectiveness of the proposed techniques.
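
    The paper's algorithm is not reproduced in the abstract; the following is a minimal sketch of the general idea, assuming trajectories are sequences of (x, y) points: group similar trajectories and replace each time step's exact locations with a bounding region shared by at least k trajectories. The greedy grouping strategy and all names are illustrative, not the authors' method.

```python
import math

def generalize_group(trajectories):
    """Replace each time step's exact points with the group's bounding
    box, so every trajectory in the group becomes indistinguishable."""
    length = min(len(t) for t in trajectories)
    generalized = []
    for i in range(length):
        xs = [t[i][0] for t in trajectories]
        ys = [t[i][1] for t in trajectories]
        generalized.append(((min(xs), min(ys)), (max(xs), max(ys))))
    return generalized

def k_anonymize(trajectories, k):
    """Greedily group each seed trajectory with its k-1 nearest
    neighbours and generalize the group; leftovers (< k) are suppressed."""
    def dist(a, b):
        n = min(len(a), len(b))
        return sum(math.dist(a[i], b[i]) for i in range(n)) / n

    remaining = list(trajectories)
    groups = []
    while len(remaining) >= k:
        seed = remaining.pop(0)
        remaining.sort(key=lambda t: dist(seed, t))
        groups.append(generalize_group([seed] + remaining[:k - 1]))
        remaining = remaining[k - 1:]
    return groups
```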

    User-centric Privacy Engineering for the Internet of Things

    User privacy concerns are widely regarded as a key obstacle to the success of modern smart cyber-physical systems. In this paper, we analyse, through an example, some of the requirements that future data collection architectures of these systems should implement to provide effective privacy protection for users. We then show how these requirements can be implemented in a smart home scenario. Our example architecture allows the user to balance privacy risks against potential benefits and take a practical decision on the extent of sharing. Based on this example architecture, we identify a number of challenges that must be addressed by future data processing systems in order to achieve effective privacy management for smart cyber-physical systems.
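
    As a rough illustration of the kind of user-controlled trade-off such an architecture could expose (the paper's actual design is not reproduced here), the sketch below gates a smart-home sensor reading on user-set risk and benefit thresholds. All names and threshold values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SharingPolicy:
    # Hypothetical user-set thresholds, both on a 0..1 scale.
    max_risk: float     # highest privacy risk the user accepts
    min_benefit: float  # smallest benefit worth any exposure

def decide_sharing(policy, risk, benefit):
    """Decide how much of a sensor reading to release, based on the
    user's own balance of privacy risk against expected benefit."""
    if risk > policy.max_risk:
        return "withhold"          # too risky regardless of benefit
    if benefit < policy.min_benefit:
        return "withhold"          # not worth any exposure
    if risk > policy.max_risk / 2:
        return "share_aggregated"  # release only coarsened data
    return "share_raw"

policy = SharingPolicy(max_risk=0.6, min_benefit=0.2)
print(decide_sharing(policy, risk=0.4, benefit=0.7))  # share_aggregated
```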

    On the Measurement of Privacy as an Attacker's Estimation Error

    A wide variety of privacy metrics have been proposed in the literature to evaluate the level of protection offered by privacy-enhancing technologies. Most of these metrics are specific to concrete systems and adversarial models, and are difficult to generalize or translate to other contexts. Furthermore, a better understanding of the relationships between the different privacy metrics is needed to enable a more grounded and systematic approach to measuring privacy, as well as to assist system designers in selecting the most appropriate metric for a given application. In this work we propose a theoretical framework for privacy-preserving systems, endowed with a general definition of privacy in terms of the estimation error incurred by an attacker who aims to disclose the private information that the system is designed to conceal. We show that our framework permits interpreting and comparing a number of well-known metrics under a common perspective. The arguments behind these interpretations rest on fundamental results from information theory, probability theory, and Bayes decision theory.
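
    As a small worked example in the spirit of this definition (a sketch, not the paper's framework), consider a discrete secret observed through a noisy channel: a Bayes-optimal attacker guesses the most probable secret for each observation, and privacy can be quantified as the probability that this guess is wrong.

```python
import numpy as np

def attacker_error(prior, channel):
    """Privacy as the attacker's Bayes estimation error: the adversary
    observes y, picks the MAP secret argmax_x p(x, y), and we measure
    the probability that this guess is wrong.

    prior:   p(x), shape (n_secrets,)
    channel: p(y|x), shape (n_secrets, n_observations)
    """
    joint = prior[:, None] * channel   # p(x, y)
    # For each y, the mass of the MAP guess being right is max_x p(x, y).
    p_correct = joint.max(axis=0).sum()
    return 1.0 - p_correct             # expected 0/1 estimation error

prior = np.array([0.5, 0.5])
noisy = np.array([[0.8, 0.2],          # a binary channel that leaks
                  [0.3, 0.7]])         # the secret imperfectly
print(attacker_error(prior, noisy))    # 0.25
```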

    Potential mass surveillance and privacy violations in proximity-based social applications

    Proximity-based social applications let users interact with people who are currently close to them, by revealing some information about their preferences and whereabouts. This information is acquired through passive geo-localisation and used to build a sense of serendipitous discovery of people, places and interests. Unfortunately, while this class of applications opens up new interaction possibilities for people in urban settings, access to certain identity information could allow an attacker to identify a user and follow their movements over a specific period of time. The same information shared through the platform could also help an attacker link the victim's online profiles to their physical identity. We analyse a set of popular dating applications that share users' relative distances within a certain radius, and show how the information shared on these platforms makes it possible to formalise a multilateration attack capable of identifying a user's actual position. The same attack can also be used to follow a user across all their movements within a certain period of time, thereby identifying their habits and points of interest across the city. Furthermore, we introduce a social attack which uses common Facebook likes to profile a person and ultimately identify their real identity.
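
    A minimal sketch of such a multilateration attack, assuming the app reports an exact distance from each vantage point (real apps round or jitter distances, which a practical attack must contend with): query from several positions, then solve for the point whose distances best match the reports. This uses SciPy's least-squares solver; all coordinates are invented.

```python
import numpy as np
from scipy.optimize import least_squares

def multilaterate(probes, distances):
    """Recover a victim's position from distances reported by a
    proximity app at several attacker-chosen vantage points."""
    probes = np.asarray(probes, dtype=float)
    distances = np.asarray(distances, dtype=float)

    def residuals(p):
        # Mismatch between candidate point p and the reported distances.
        return np.linalg.norm(probes - p, axis=1) - distances

    return least_squares(residuals, probes.mean(axis=0)).x

# Attacker moves to three positions and records the reported distance.
probes = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
victim = np.array([40.0, 70.0])
dists = [np.linalg.norm(np.array(p) - victim) for p in probes]
print(multilaterate(probes, dists))  # ~[40. 70.]
```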

    On the anonymity risk of time-varying user profiles.

    Websites and applications use personalisation services to profile their users, collect their patterns and activities, and eventually use this data to provide tailored suggestions. User preferences and social interactions are therefore aggregated and analysed. Every time a user publishes a new post or creates a link with another entity, either another user or some online resource, new information is added to the user profile. Exposing private data not only reveals information about single users' preferences, increasing their privacy risk, but can expose more about their network than single actors intended. This mechanism is self-evident in social networks, where users receive suggestions based on their friends' activities. We propose an information-theoretic approach to measure the differential update of the anonymity risk of time-varying user profiles. This expresses how privacy is affected when new content is posted, and how much third-party services learn about users when a new activity is shared. We use actual Facebook data to show how our model can be applied to a real-world scenario.
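
    The paper's metric is not reproduced in the abstract; as a sketch of one information-theoretic way to score a differential update, one can compare the profile's category distribution before and after a new post, e.g. with a KL divergence measured in bits. The categories and counts below are invented for illustration.

```python
import numpy as np

def profile_dist(counts):
    """Turn per-category activity counts in a user profile into a
    probability distribution over interest categories."""
    counts = np.asarray(counts, dtype=float)
    return counts / counts.sum()

def kl_divergence(p, q):
    """KL divergence D(p || q) in bits; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Counts over hypothetical (sports, politics, music) categories.
before = profile_dist([10, 5, 5])
after = profile_dist([10, 6, 5])      # one new post about politics
print(kl_divergence(after, before))   # bits revealed by the update
```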