Privacy in trajectory micro-data publishing : a survey
We survey the literature on the privacy of trajectory micro-data, i.e.,
spatiotemporal information about the mobility of individuals, whose collection
is becoming increasingly simple and frequent thanks to emerging information and
communication technologies. The focus of our review is on privacy-preserving
data publishing (PPDP), i.e., the publication of databases of trajectory
micro-data that preserve the privacy of the monitored individuals. We classify
and present the literature of attacks against trajectory micro-data, as well as
solutions proposed to date for protecting databases from such attacks. This
paper serves as an introductory reading on a critical subject in an era of
growing awareness about privacy risks connected to digital services, and
provides insights into open problems and future directions for research.
Comment: Accepted for publication at Transactions on Data Privacy.
Towards Mobility Data Science (Vision Paper)
Mobility data captures the locations of moving objects such as humans,
animals, and cars. With the availability of GPS-equipped mobile devices and
other inexpensive location-tracking technologies, mobility data is collected
ubiquitously. In recent years, the use of mobility data has demonstrated
significant impact in various domains including traffic management, urban
planning, and health sciences. In this paper, we present the emerging domain of
mobility data science. Towards a unified approach to mobility data science, we
envision a pipeline having the following components: mobility data collection,
cleaning, analysis, management, and privacy. For each of these components, we
explain how mobility data science differs from general data science, we survey
the current state of the art and describe open challenges for the research
community in the coming years.
Comment: Updated arXiv metadata to include two authors that were missing from the metadata. The PDF has not been changed.
Obfuscation and anonymization methods for locational privacy protection : a systematic literature review
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies.
The development of mobile technology, combined with the business model of most application companies, poses a potential risk to individuals' privacy, because the industry's default practice is unrestricted data collection. Although collected data has legitimate uses in improving services and procedures, it also undermines users' privacy. It is therefore crucial to understand the state of the art in privacy protection mechanisms.
Privacy protection can be pursued by passing new regulation and by developing privacy-preserving mechanisms. Understanding to what extent current technology can protect devices and systems is important for driving advances in the privacy-preserving field, and for addressing the limits and challenges of deploying mechanisms with a reasonable quality-of-service (QoS) level.
This research aims to present and discuss current privacy-preserving schemes, their capabilities, limitations, and challenges.
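Among the obfuscation schemes such reviews cover, spatial cloaking is one of the simplest: each reported location is generalized to a coarse grid cell, so the published point is only accurate to the cell size. A minimal sketch; the cell size and function name are illustrative choices, not taken from the dissertation:

```python
def cloak(lat: float, lon: float, cell: float = 0.01) -> tuple[float, float]:
    """Snap a GPS fix to the centre of its grid cell.

    A cell of 0.01 degrees (~1.1 km of latitude) is an example value;
    real deployments tune it to the desired privacy/utility trade-off.
    """
    def snap(x: float) -> float:
        return round(int(x // cell) * cell + cell / 2, 6)  # floor to cell, move to centre
    return (snap(lat), snap(lon))
```

All points inside the same cell become indistinguishable, which is the basis of k-anonymous cloaking regions when the cell is grown until it holds at least k users.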
A survey on privacy in human mobility
In recent years we have witnessed pervasive use of location-aware technologies such as vehicular GPS-enabled devices, RFID-based tools, and mobile phones, which generate the collection and storage of large amounts of human mobility data. The power of this data has been recognized by both the scientific community and industry. Human mobility data can be used for different purposes, such as urban traffic management, urban planning, and urban pollution estimation. Unfortunately, data describing human mobility is sensitive: people's whereabouts may allow re-identification of individuals in a de-identified database, and access to the places visited by individuals may enable the inference of sensitive information such as religious beliefs, sexual preferences, and health conditions. The literature reports many approaches aimed at overcoming privacy issues in mobility data, so in this survey we discuss advancements in privacy-preserving mobility data publishing. We first describe the adversarial attack and privacy models typically considered for mobility data, then present frameworks for privacy risk assessment, and finally discuss three main categories of privacy-preserving strategies: methods based on anonymization of mobility data, methods based on differential privacy models, and methods that protect privacy by exploiting generative models for synthetic trajectory generation.
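Of the strategy categories such surveys discuss, the differential-privacy family is often instantiated for single location reports via geo-indistinguishability, which perturbs a point with planar Laplace noise. A minimal sketch, assuming distances are measured in degrees for simplicity, and using the fact that the radial component of planar Laplace noise follows a Gamma(2, 1/eps) distribution:

```python
import math
import random

def geo_ind_noise(lat: float, lon: float, eps: float) -> tuple[float, float]:
    """Perturb a point with planar Laplace noise (geo-indistinguishability).

    eps is the privacy parameter per unit of distance; here distance is in
    degrees for simplicity, while a real system would work in metres.
    """
    theta = random.uniform(0, 2 * math.pi)   # uniformly random direction
    r = random.gammavariate(2, 1 / eps)      # radial part ~ Gamma(2, 1/eps)
    return (lat + r * math.sin(theta), lon + r * math.cos(theta))
```

The expected displacement is 2/eps, so smaller eps means stronger privacy and noisier locations; trajectory-level protection additionally has to account for correlation between successive points.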
Protecting privacy of semantic trajectory
The growing ubiquity of GPS-enabled devices in everyday life has made large-scale collection of trajectories feasible, providing ever-growing opportunities for human movement analysis. However, publishing this vulnerable data is accompanied by increasing concerns about individuals' geoprivacy. This thesis has two objectives: (1) propose a privacy protection framework for semantic trajectories, and (2) develop a Python toolbox in the ArcGIS Pro environment that enables non-expert users to anonymize trajectory data. The former aims to prevent users' re-identification, when an adversary knows their important locations or any random spatiotemporal points, by swapping their important locations to new locations with the same semantics and unlinking the users from their trajectories. This is accomplished by converting GPS points into sequences of visited meaningful locations and moves, and by integrating several anonymization techniques. The second component of this thesis implements privacy protection so that even users without deep knowledge of anonymization or coding skills can anonymize their data, through an all-in-one toolbox. By proposing and implementing this framework and toolbox, we hope that trajectory privacy will be better protected in research.
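The location-swapping idea described above, replacing an important stop with another location of the same semantic type, can be sketched as follows; the data layout and names are hypothetical and do not reflect the thesis's actual toolbox API:

```python
import random

def swap_important_locations(trajectory, semantic_pool):
    """Replace each stop with a random same-semantic location.

    trajectory: list of (place_id, semantic_label) stops.
    semantic_pool: maps a semantic label (e.g. 'hospital') to candidate
    place_ids of that type. Both structures are illustrative.
    """
    out = []
    for place, label in trajectory:
        candidates = [p for p in semantic_pool.get(label, []) if p != place]
        # keep the original place when no same-semantic alternative exists
        out.append((random.choice(candidates) if candidates else place, label))
    return out
```

Because the replacement shares the semantic label, aggregate analyses over place types (e.g. hospital visit counts) are preserved while the link to the specific place, and hence to the user, is weakened.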
SoK: differentially private publication of trajectory data
Trajectory analysis holds many promises, from improvements in traffic management to routing advice or infrastructure development. However, learning users' paths is extremely privacy-invasive. Therefore, there is a necessity to protect trajectories so that we preserve the global properties useful for analysis, while specific and private information about individuals remains inaccessible. Trajectories, however, are difficult to protect, since they are sequential, highly dimensional, correlated, bound to geophysical restrictions, and easily mapped to semantic points of interest. This paper aims to establish a systematic framework of protective masking measures for trajectory databases with differentially private (DP) guarantees, including utility properties, derived from the ideas and limitations of existing proposals. To reach this goal, we systematize the utility metrics used throughout the literature, deeply analyze the DP granularity notions, explore and elaborate on the state of the art in privacy-enhancing mechanisms and their problems, and expose the main limitations of DP notions in the context of trajectories.
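One concrete instance of the DP granularity distinction such systematizations analyze is the Laplace mechanism applied to a visit-count histogram under event-level guarantees. A minimal sketch; the function and variable names are illustrative:

```python
import random
from collections import Counter

def dp_visit_histogram(trajectories, eps: float):
    """Release per-cell visit counts under event-level DP.

    Adding or removing a single location event changes one count by 1,
    so the L1 sensitivity is 1 and Laplace noise of scale 1/eps suffices.
    Under user-level DP, sensitivity would instead be the maximum
    trajectory length, requiring correspondingly larger noise.
    """
    counts = Counter(cell for traj in trajectories for cell in traj)
    scale = 1.0 / eps
    # difference of two exponentials is a Laplace(0, scale) sample
    def lap() -> float:
        return random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return {cell: c + lap() for cell, c in counts.items()}
```

The gap between event-level and user-level sensitivity is one reason trajectory data is harder to protect than independent location reports.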