701 research outputs found
Differential Private Data Collection and Analysis Based on Randomized Multiple Dummies for Untrusted Mobile Crowdsensing
Mobile crowdsensing, which collects environmental information from mobile phone users, is growing in popularity. These data can be used by companies for marketing surveys or decision making. However, collecting sensing data from users may violate their privacy. Moreover, the data aggregator and/or the participants of crowdsensing may be untrusted entities. Recent studies have proposed randomized response schemes for anonymized data collection. This kind of data collection can analyze the sensing data of users statistically without precise information about any individual user's sensing results. However, traditional randomized response schemes and their extensions require a large number of samples to achieve proper estimation. In this paper, we propose a new anonymized data-collection scheme that can estimate data distributions more accurately. Using simulations with synthetic and real datasets, we show that our proposed method can reduce the mean squared error and the JS divergence by more than 85% compared with existing schemes.
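To make the randomized-response idea concrete, here is a minimal sketch of generic k-ary randomized response with frequency estimation. This is a standard building block for anonymized data collection, not the paper's specific multiple-dummies scheme; the function names and the perturbation probability `p` are illustrative assumptions.

```python
import random
from collections import Counter

def randomize(value, categories, p=0.6):
    """Perturb one user's value on-device: report the true category with
    probability p, otherwise report a uniformly random category."""
    if random.random() < p:
        return value
    return random.choice(categories)

def estimate_distribution(reports, categories, p=0.6):
    """Invert the known perturbation to estimate true category frequencies.
    P(report = c) = p * pi_c + (1 - p) / k, so pi_c can be recovered."""
    n = len(reports)
    k = len(categories)
    counts = Counter(reports)
    q = (1 - p) / k  # chance any given category is reported as noise
    return {c: (counts.get(c, 0) / n - q) / p for c in categories}
```

The aggregator only ever sees perturbed reports, yet the estimator is unbiased; its variance shrinks with the sample size, which is exactly why plain randomized response needs many samples to estimate distributions accurately.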
Design and evaluation of a privacy architecture for crowdsensing applications
By using consumer devices such as cellphones, wearables and Internet of Things devices owned by citizens, crowdsensing systems are providing solutions to the community in areas such as transportation, security, entertainment and the environment through the collection of various types of sensor data. Privacy is a major issue in these systems because the data collected can potentially reveal aspects considered private by the contributors of data. We propose the Privacy-Enabled ARchitecture (PEAR), a layered architecture aimed at protecting privacy in privacy-aware crowdsensing systems. We identify and describe the layers of the architecture, and we propose and evaluate the design of MetroTrack, a crowdsensing system based on the proposed PEAR architecture.
Trust and obfuscation principles for quality of information in emerging pervasive environments
Non peer reviewed. Postprint.
Obfuscation and anonymization methods for locational privacy protection : a systematic literature review
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies. The development of mobile technology, combined with the business model of most application companies, poses a potential risk to individuals' privacy, because the industry's default practice is unrestricted data collection. Although data collection has legitimate uses in improving services and procedures, it also undermines users' privacy. It is therefore crucial to understand the state of the art in privacy protection mechanisms. Privacy protection can be pursued by passing new regulation and by developing privacy-preserving mechanisms. Understanding to what extent current technology can protect devices and systems is important for driving advances in the privacy-preserving field, and for addressing the limits and challenges of deploying such mechanisms at a reasonable quality-of-service (QoS) level. This research aims to present and discuss current privacy-preserving schemes, their capabilities, limitations and challenges.
Privacy-enhancing Aggregation of Internet of Things Data via Sensors Grouping
Big data collection practices using Internet of Things (IoT) pervasive
technologies are often privacy-intrusive and result in surveillance, profiling,
and discriminatory actions against citizens, which in turn undermine citizens'
participation in the development of sustainable smart cities.
Nevertheless, real-time data analytics and aggregate information from IoT
devices open up tremendous opportunities for managing smart city
infrastructures. The privacy-enhancing aggregation of distributed sensor data,
such as residential energy consumption or traffic information, is the research
focus of this paper. Citizens have the option to choose their privacy level by
reducing the quality of the shared data at a cost of a lower accuracy in data
analytics services. A baseline scenario is considered in which IoT sensor data
are shared directly with an untrustworthy central aggregator. A grouping
mechanism is introduced that improves privacy by first aggregating data
at a group level, as opposed to sharing data directly with the central
aggregator. Group-level aggregation obfuscates sensor data of individuals, in a
similar fashion as differential privacy and homomorphic encryption schemes,
thus inference of privacy-sensitive information from single sensors becomes
computationally harder compared to the baseline scenario. The proposed system
is evaluated using real-world data from two smart city pilot projects. Privacy
under grouping increases while the accuracy of the baseline scenario is
preserved. The influence of each group member's privacy choice on the other
members is measured, and privacy fairness is found to be maximized among
group members with similar privacy choices. Several grouping strategies are
compared; grouping by proximity of privacy choices provides the highest privacy
gains. The implications of this strategy for the design of incentive mechanisms
are discussed.
- …