
    Clustering with diversity

    We consider the clustering with diversity problem: given a set of colored points in a metric space, partition them into clusters such that each cluster has at least ℓ points, all of which have distinct colors. We give a 2-approximation to this problem for any ℓ when the objective is to minimize the maximum radius of any cluster. We show that this approximation ratio is optimal unless P = NP by providing a matching lower bound. Several extensions to our algorithm have also been developed for handling outliers. This problem is mainly motivated by applications in privacy-preserving data publication. Comment: Extended abstract accepted at ICALP 2010. Keywords: approximation algorithm, k-center, k-anonymity, l-diversity
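    To make the problem statement concrete (this is only an illustrative sketch of the constraint and objective, not the paper's algorithm; the radius convention below, picking a center among a cluster's own points, is an assumption), a small Python check might look like this:

```python
import math

def is_diverse(clusters, color_of, ell):
    """Feasibility check: every cluster has at least ell points, all of
    pairwise-distinct colors (the 'diversity' constraint in the abstract)."""
    for pts in clusters:
        colors = {color_of[p] for p in pts}
        if len(pts) < ell or len(colors) != len(pts):
            return False
    return True

def max_cluster_radius(clusters):
    """Objective: the largest cluster radius, where a cluster's radius is
    taken here as the best radius achievable by choosing a center among the
    cluster's own points (a common k-center-style convention; assumed)."""
    worst = 0.0
    for pts in clusters:
        radius = min(max(math.dist(c, p) for p in pts) for c in pts)
        worst = max(worst, radius)
    return worst

# toy instance: points in the plane with color labels, ell = 2
color_of = {(0.0, 0.0): "red", (1.0, 0.0): "blue",
            (5.0, 5.0): "red", (5.0, 6.0): "green"}
clusters = [[(0.0, 0.0), (1.0, 0.0)], [(5.0, 5.0), (5.0, 6.0)]]
print(is_diverse(clusters, color_of, ell=2), max_cluster_radius(clusters))
```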

    Privacy-enhancing Aggregation of Internet of Things Data via Sensors Grouping

    Big data collection practices using Internet of Things (IoT) pervasive technologies are often privacy-intrusive and result in surveillance, profiling, and discriminatory actions against citizens, which in turn undermine citizens' participation in the development of sustainable smart cities. Nevertheless, real-time data analytics and aggregate information from IoT devices open up tremendous opportunities for managing smart city infrastructures. The privacy-enhancing aggregation of distributed sensor data, such as residential energy consumption or traffic information, is the research focus of this paper. Citizens have the option to choose their privacy level by reducing the quality of the shared data, at the cost of lower accuracy in data analytics services. A baseline scenario is considered in which IoT sensor data are shared directly with an untrustworthy central aggregator. A grouping mechanism is introduced that improves privacy by first aggregating data at the group level, as opposed to sharing data directly with the central aggregator. Group-level aggregation obfuscates the sensor data of individuals, in a similar fashion as differential privacy and homomorphic encryption schemes, so that inferring privacy-sensitive information about single sensors becomes computationally harder than in the baseline scenario. The proposed system is evaluated using real-world data from two smart city pilot projects. Privacy under grouping increases while the accuracy of the baseline scenario is preserved. Intra-group influences of one group member's privacy on the others are measured, and fairness in privacy is found to be maximized between group members with similar privacy choices. Several grouping strategies are compared. Grouping by proximity of privacy choices provides the highest privacy gains. The implications of this strategy for the design of incentive mechanisms are discussed.
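    The contrast between the baseline and the grouping mechanism can be sketched in a few lines of Python (illustrative names and toy data, not the paper's implementation): in the baseline every individual reading reaches the untrusted aggregator, whereas under grouping only per-group sums are forwarded, so individual readings stay hidden while the overall aggregate is preserved.

```python
from collections import defaultdict

def baseline_report(readings):
    """Baseline: every (sensor_id, value) pair reaches the untrusted aggregator."""
    return list(readings.items())

def grouped_report(readings, group_of):
    """Grouping: readings are summed within each group first; the aggregator
    only receives one aggregate value per group, not individual readings."""
    sums = defaultdict(float)
    for sensor_id, value in readings.items():
        sums[group_of[sensor_id]] += value
    return dict(sums)

# toy example: residential energy readings (kWh) and a grouping of sensors
readings = {"s1": 1.2, "s2": 0.8, "s3": 2.5, "s4": 1.1}
group_of = {"s1": "g1", "s2": "g1", "s3": "g2", "s4": "g2"}

print(sum(v for _, v in baseline_report(readings)))          # total, baseline
print(sum(grouped_report(readings, group_of).values()))      # same total under grouping
```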

    Location Privacy in the Era of Big Data and Machine Learning

    Individuals' location data is one of the most sensitive sources of information; once revealed to ill-intentioned individuals or service providers, it can cause severe privacy concerns. In this thesis, we aim to preserve the privacy of users in telecommunication networks against untrusted service providers as well as to improve their privacy in the publication of location datasets. To improve the location privacy of users in telecommunication networks, we consider the movement of users along trajectories and investigate the threats that query history may pose to location privacy. We develop an attack model based on the Viterbi algorithm, termed the Viterbi attack, which represents a realistic privacy threat against trajectories. Next, we propose a metric called transition entropy that helps evaluate the performance of dummy generation algorithms, and then develop a robust dummy generation algorithm that can defend users against the Viterbi attack. We compare and evaluate our proposed algorithm and metric on a publicly available dataset published by Microsoft, the Geolife dataset. For privacy-preserving data publishing, an enhanced framework for the anonymization of spatio-temporal trajectory datasets, termed machine-learning-based anonymization (MLA), is proposed. The framework consists of a robust alignment technique and a machine learning approach for clustering datasets. The framework and all the proposed algorithms are applied to the Geolife dataset, which includes GPS logs of over 180 users in Beijing, China.
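    The Viterbi attack builds on the standard Viterbi algorithm for hidden Markov models. As background only, here is a compact Python sketch of plain Viterbi decoding; the thesis's actual attack model, its transition and emission estimates, and the dummy-generation defence are more involved, and all names below are illustrative.

```python
def viterbi(observations, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state sequence (e.g., a user's true
    locations) given a sequence of observations (e.g., reported locations)."""
    # best[t][s] = probability of the best path ending in state s at time t
    best = [{s: start_p[s] * emit_p[s][observations[0]] for s in states}]
    back = [{}]
    for t in range(1, len(observations)):
        best.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (best[t - 1][r] * trans_p[r][s] * emit_p[s][observations[t]], r)
                for r in states
            )
            best[t][s] = prob
            back[t][s] = prev
    # backtrack from the most probable final state
    last = max(states, key=lambda s: best[-1][s])
    path = [last]
    for t in range(len(observations) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))
```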

    User-centric privacy preservation in Internet of Things Networks

    Recent trends show how the Internet of Things (IoT) and its services are becoming increasingly ubiquitous and popular. The end-to-end IoT services that are extensively used include everything from neighborhood discovery to smart home security systems, wearable health monitors, and connected appliances and vehicles. IoT leverages different kinds of networks, such as location-based social networks, mobile edge systems, and digital twin networks, to realize these services. Many of these services rely on a constant feed of user information, and how this data is processed can vary significantly depending on the network being used. The key concern is that a great deal of data is collected while users have little to no control over how extensively their data is used and what information is being used. This raises many privacy concerns, especially for a naïve user who does not know the implications and consequences of severe privacy breaches. When designing privacy policies, we need to understand the different user data types used in these networks. These include user profile information, information from the queries used to obtain services (communication privacy), and location information, which is needed in many on-the-go services. Depending on the context of the application and the service being provided, the user data at risk and the risks themselves vary. First, we dive deep into the networks and examine the different aspects of privacy for user data and the issues faced in each such aspect. We then propose different privacy policies for these networks and focus on two main aspects of designing privacy mechanisms: the quality of service the user expects and what information is private from the user's perspective. The novel contribution here is to focus on what the user thinks and needs instead of fixating on designing privacy policies that only satisfy the quality-of-service requirements of third-party applications.
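    As a purely illustrative sketch of the user-centric viewpoint described above (the field names, privacy levels, and quality-of-service field below are assumptions, not the thesis's design), a per-user privacy profile might record, for each data type, how much the user is willing to reveal and the minimum quality of service they expect in return:

```python
from dataclasses import dataclass
from enum import Enum

class PrivacyLevel(Enum):
    NONE = 0      # share the data as-is
    COARSE = 1    # share a degraded/generalized version
    WITHHOLD = 2  # do not share at all

@dataclass
class UserPrivacyProfile:
    """What the user is willing to reveal, per data type, plus the minimum
    quality of service they expect from the application in return."""
    profile_info: PrivacyLevel
    query_content: PrivacyLevel      # communication privacy
    location: PrivacyLevel
    min_quality_of_service: float    # e.g. acceptable accuracy in [0, 1]

# example: a user who hides exact location but accepts coarse query logging
alice = UserPrivacyProfile(
    profile_info=PrivacyLevel.NONE,
    query_content=PrivacyLevel.COARSE,
    location=PrivacyLevel.WITHHOLD,
    min_quality_of_service=0.8,
)
```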