
    Location Privacy in Spatial Crowdsourcing

    Spatial crowdsourcing (SC) is a new platform that engages individuals in collecting and analyzing environmental, social and other spatiotemporal information. With SC, requesters outsource their spatiotemporal tasks to a set of workers, who perform the tasks by physically traveling to the tasks' locations. This chapter identifies privacy threats to both workers and requesters during the two main phases of spatial crowdsourcing: tasking and reporting. Tasking is the process of identifying which tasks should be assigned to which workers; it is handled by a spatial crowdsourcing server (SC-server). Reporting is the phase in which workers travel to the tasks' locations, complete the tasks and upload their reports to the SC-server. The challenge is to enable effective and efficient tasking and reporting in SC without disclosing the actual locations of workers (at least until they agree to perform a task) or of the tasks themselves (at least to workers who are not assigned to those tasks). This chapter provides an overview of the state of the art in protecting users' location privacy in spatial crowdsourcing. We present a comparative study of a diverse set of solutions in terms of task publishing modes (push vs. pull), problem focus (tasking and reporting), threats (server, requester and worker), and underlying technical approaches (from pseudonymity, cloaking, and perturbation to exchange-based and encryption-based techniques). The strengths and drawbacks of the techniques are highlighted, leading to a discussion of open problems and future work.
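    As a concrete illustration of the cloaking family of techniques surveyed above, the minimal sketch below (an assumption of this summary, not code from the chapter) snaps a worker's position to a coarse grid cell before it is sent to the SC-server during tasking, so the server learns only the cell rather than the exact location. The function name and the cell size `cell_deg` are illustrative choices.

```python
import math

def cloak_location(lat, lon, cell_deg=0.01):
    """Grid-based spatial cloaking (illustrative sketch, not the chapter's method).

    Before contacting the SC-server in the tasking phase, the worker reports
    only the centre of the grid cell containing the true position, so every
    worker inside the same cell is indistinguishable to the server.
    cell_deg is an assumed cell size in degrees (~1.1 km at the equator for 0.01).
    """
    cell_lat = math.floor(lat / cell_deg) * cell_deg
    cell_lon = math.floor(lon / cell_deg) * cell_deg
    return cell_lat + cell_deg / 2, cell_lon + cell_deg / 2

# Example: a worker at (34.0224, -118.2851) reports only the cell centre.
print(cloak_location(34.0224, -118.2851))  # -> roughly (34.025, -118.285)
```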

    Time Distortion Anonymization for the Publication of Mobility Data with High Utility

    An increasing amount of mobility data is collected every day by different means, such as mobile applications or crowd-sensing campaigns. This data is sometimes published after the application of simple anonymization techniques (e.g., replacing users' names with identifiers), which can lead to severe threats to the privacy of the participating users. The literature contains more sophisticated anonymization techniques, often based on adding noise to the spatial data. However, these techniques compromise either privacy, if the added noise is too weak, or the utility of the data, if the added noise is too strong. In this paper we investigate an alternative solution, which builds on time distortion instead of spatial distortion. Specifically, our contribution lies in (1) the introduction of the concept of time distortion to anonymize mobility datasets, (2) Promesse, a protection mechanism implementing this concept, and (3) a practical study of Promesse compared to two representative spatial distortion mechanisms, namely Wait For Me, which enforces k-anonymity, and Geo-Indistinguishability, which enforces differential privacy. We evaluate our mechanism on three real-life datasets. Our results show that time distortion reduces the number of points of interest that an adversary can retrieve to under 3%, while the introduced spatial error is almost null and the distortion introduced on the results of range queries stays under 13% on average.
    Comment: In the 14th IEEE International Conference on Trust, Security and Privacy in Computing and Communications, Aug 2015, Helsinki, Finland.
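    To make the time-distortion idea concrete, the sketch below is an assumption based on the abstract rather than the authors' published implementation of Promesse: it resamples a trace at a fixed spatial step and spreads the timestamps uniformly, so every released point implies the same speed and dwell times at points of interest disappear. The step size and the (x, y, t) tuple layout are illustrative.

```python
import math

def resample_constant_speed(trace, step_m=50.0):
    """Time-distortion sketch (illustrative, not the authors' exact algorithm).

    trace is a list of (x, y, t) tuples in a local planar frame (metres,
    seconds). The trace is resampled every step_m metres along its path and
    the timestamps are reassigned uniformly between the first and last ones,
    so the released trajectory has constant speed and stops are hidden.
    """
    if len(trace) < 2:
        return list(trace)

    # Cumulative distance along the original trace.
    dists = [0.0]
    for (x0, y0, _), (x1, y1, _) in zip(trace, trace[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1]
    if total == 0.0:
        return [trace[0]]

    # Spatial resampling: one point every step_m metres along the path.
    resampled, d, i = [], 0.0, 0
    while d <= total:
        while dists[i + 1] < d:
            i += 1
        seg = dists[i + 1] - dists[i]
        frac = (d - dists[i]) / seg if seg > 0 else 0.0
        x = trace[i][0] + frac * (trace[i + 1][0] - trace[i][0])
        y = trace[i][1] + frac * (trace[i + 1][1] - trace[i][1])
        resampled.append((x, y))
        d += step_m

    # Temporal smoothing: timestamps spread uniformly, i.e. constant speed.
    t0, t1 = trace[0][2], trace[-1][2]
    n = len(resampled)
    return [(x, y, t0 + (t1 - t0) * k / max(n - 1, 1))
            for k, (x, y) in enumerate(resampled)]
```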

    Empirical Analysis of Privacy Preservation Models for Cyber Physical Deployments from a Pragmatic Perspective

    Privacy protection in cyber-physical installations spans several sectors and calls for methods such as encryption, hashing, secure routing, obfuscation, and data exchange. A privacy preservation model for cyber-physical deployments should account for data privacy, location privacy, temporal privacy, node privacy, route privacy, and other forms of privacy. Incorporating these models into a wireless network is computationally demanding and affects quality-of-service (QoS) characteristics such as end-to-end delay, throughput, energy consumption, and packet delivery ratio. Network designers must therefore choose privacy models that have the least negative influence on these QoS characteristics. In practice, designers have applied common privacy models to protect cyber-physical infrastructure without accounting for the limited interconnection and interface-ability of these installations; as a result, network security has increased while the network's overall quality of service has dropped. This study examines and analyzes state-of-the-art methods for preserving privacy in cyber-physical deployments without compromising their QoS performance, with the aim of reducing the likelihood of such outcomes. The models are rated by the level of privacy they provide, end-to-end data-transfer delay, energy consumption, and network speed. The comparison will help network designers and researchers select the best models for their particular deployments, maximizing privacy while maintaining a high level of service performance. The study also offers a set of tactics that, used together, can further improve performance, and surveys proven machine learning approaches that networks may consider to enhance their privacy performance.
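    The comparison described above amounts to a multi-criteria ranking of privacy models against QoS measurements. The sketch below is a hypothetical illustration of that kind of ranking (a simple weighted score assumed for this summary, not the scoring method used in the paper); the model names, metric names and numbers are made up.

```python
def rank_models(models, weights=None):
    """Illustrative privacy-vs-QoS ranking (not the paper's actual method).

    models maps a model name to measurements: a privacy level plus QoS
    metrics. Each metric is min-max normalised; metrics where smaller is
    better (delay, energy) are inverted before the weighted sum.
    """
    weights = weights or {"privacy": 0.4, "delay_ms": 0.2,
                          "energy_mj": 0.2, "pdr": 0.2}
    higher_is_better = {"privacy": True, "delay_ms": False,
                        "energy_mj": False, "pdr": True}

    def normalise(metric):
        vals = [m[metric] for m in models.values()]
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0
        return {name: (m[metric] - lo) / span for name, m in models.items()}

    normed = {metric: normalise(metric) for metric in weights}
    scores = {}
    for name in models:
        scores[name] = sum(
            w * (normed[metric][name] if higher_is_better[metric]
                 else 1.0 - normed[metric][name])
            for metric, w in weights.items())
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Made-up numbers for illustration only.
candidates = {
    "onion_routing": {"privacy": 0.9, "delay_ms": 120, "energy_mj": 40, "pdr": 0.92},
    "grid_cloaking": {"privacy": 0.6, "delay_ms": 35, "energy_mj": 12, "pdr": 0.97},
    "no_privacy":    {"privacy": 0.1, "delay_ms": 20, "energy_mj": 10, "pdr": 0.99},
}
print(rank_models(candidates))
```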

    Differential Privacy for Industrial Internet of Things: Opportunities, Applications and Challenges

    The development of the Internet of Things (IoT) is bringing changes to many fields. In particular, the industrial Internet of Things (IIoT) is promoting a new round of industrial revolution. With more applications of IIoT, privacy protection issues are emerging. Notably, common algorithms used in IIoT, such as deep models, rely strongly on data collection, which creates a risk of privacy disclosure. Recently, differential privacy has been used to protect user-terminal privacy in IIoT, so in-depth research on this topic is necessary. In this paper, we conduct a comprehensive survey of the opportunities, applications and challenges of differential privacy in IIoT. We first review related work on IIoT and on privacy protection. We then focus on metrics for industrial data privacy and analyze the tension between data utilization for deep models and individual privacy protection. Several valuable problems are summarized and new research ideas are put forward. In conclusion, this survey provides a comprehensive summary and lays a foundation for follow-up research on industrial differential privacy.
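    As a concrete example of the kind of mechanism such surveys cover, the sketch below applies the standard Laplace mechanism to an aggregate sensor statistic. The sensor readings, value range and epsilon are assumptions chosen for illustration, not values from the paper.

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Standard Laplace mechanism: add Laplace(0, sensitivity/epsilon) noise
    to a query answer so that any single device's contribution is hidden."""
    scale = sensitivity / epsilon
    # The difference of two i.i.d. exponentials with mean `scale` is
    # Laplace-distributed with that scale.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_value + noise

# Example: release the mean temperature over a shop floor, assuming each
# reading lies in [0, 100] C, so the mean has sensitivity (100 - 0) / n.
readings = [21.3, 22.1, 20.8, 23.4, 22.7]       # illustrative data
sensitivity = 100.0 / len(readings)
private_mean = laplace_mechanism(sum(readings) / len(readings),
                                 sensitivity, epsilon=0.5)
print(round(private_mean, 2))
```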