
    Privacy through uncertainty in location-based services

    Location-Based Services (LBS) are becoming more prevalent. While there are many benefits, there are also real privacy risks. People are unwilling to give up the benefits, but can we reduce privacy risks without giving up on LBS entirely? This paper explores the possibility of introducing uncertainty into location information when using an LBS, so as to reduce privacy risk while maintaining good quality of service. It also explores the current uses of uncertainty information in a selection of mobile applications.
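    The core idea, deliberately adding uncertainty to reported coordinates, can be sketched as follows. This is a minimal illustration, not the paper's mechanism; the function name and the uniform-disc perturbation are assumptions:

```python
import math
import random

def blur_location(lat, lon, radius_m=500):
    """Report a point chosen uniformly at random within radius_m metres
    of the true location (illustrative obfuscation, not a formal privacy
    mechanism)."""
    r = radius_m * math.sqrt(random.random())  # sqrt gives uniform density over the disc
    theta = random.uniform(0, 2 * math.pi)
    dlat = (r * math.cos(theta)) / 111_320  # approx. metres per degree of latitude
    dlon = (r * math.sin(theta)) / (111_320 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```

    The service still receives a usable point (within ~500 m here), while the user's exact position is hidden inside the disc.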

    A Privacy Protection Mechanism for Mobile Online Social Networks

    A location sharing system is the most critical component of mobile online social networks (MOSNs), and huge amounts of users' location information will be stored by the service providers. Moreover, earlier work cannot guarantee both location privacy and social network privacy to the user. To enhance privacy against an inside attacker run by the service provider in MOSNs, we introduce a new architecture with multiple servers and a protected solution that supports location sharing among both friends and strangers. The friend set in each user query submitted to the location server is divided into multiple subsets by that server. When a user queries the server, data can be retrieved only for registered users rather than for all users. We implement privacy using three levels of security: high, medium, and low. Alongside location sharing, the scheme offers checkability of the search results returned from the servers. We also prove that the new construction is secure under a stronger security model with enhanced privacy.
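    The three-tier privacy levels described above might be pictured as coordinate coarsening. This is purely illustrative; the paper's actual construction is cryptographic and server-side, and the level names and rounding choices below are assumptions:

```python
# Hypothetical tiers: higher privacy keeps fewer decimal places of the
# coordinates (0 decimals is roughly a 111 km cell, 3 decimals roughly 110 m).
LEVELS = {"high": 0, "medium": 1, "low": 3}

def shared_location(lat, lon, privacy_level):
    """Coarsen coordinates according to the querying friend's privacy tier:
    'high' privacy reveals the least precise position, 'low' the most."""
    digits = LEVELS[privacy_level]
    return round(lat, digits), round(lon, digits)
```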

    On the Anonymization of Differentially Private Location Obfuscation

    Obfuscation techniques in location-based services (LBSs) have been shown to be useful for hiding the concrete locations of service users, but they do not necessarily provide anonymity. We quantify the anonymity of location data obfuscated by the planar Laplacian mechanism and by the optimal geo-indistinguishable mechanism of Bordenabe et al. We empirically show that the latter provides stronger anonymity than the former, in the sense that more users in the database satisfy k-anonymity. To formalize and analyze such approximate anonymity we introduce the notion of asymptotic anonymity. We then show that the location data obfuscated by the optimal geo-indistinguishable mechanism can be anonymized by removing a smaller number of users from the database. Furthermore, we demonstrate that the optimal geo-indistinguishable mechanism has better utility both for users and for data analysts. Comment: ISITA'18 conference paper
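    The planar Laplacian mechanism mentioned above can be sampled exactly: the reported point's angle is uniform, and its radius follows a Gamma(2, 1/ε) distribution (density proportional to r·e^(−εr)), which is the sum of two Exponential(ε) draws. A minimal sketch, with approximate degree-to-metre conversion constants:

```python
import math
import random

def planar_laplace(lat, lon, epsilon):
    """Sample a noisy location from the planar Laplacian mechanism.
    epsilon is the privacy parameter per metre; smaller epsilon means
    more noise. The expected displacement is 2/epsilon metres."""
    theta = random.uniform(0, 2 * math.pi)
    # Gamma(2, 1/epsilon) radius = sum of two Exponential(epsilon) draws
    r = random.expovariate(epsilon) + random.expovariate(epsilon)
    dlat = (r * math.cos(theta)) / 111_320
    dlon = (r * math.sin(theta)) / (111_320 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```

    With epsilon = 0.01 per metre, reported points scatter around the true location with a mean displacement of about 200 m.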

    Search Me If You Can: Privacy-preserving Location Query Service

    Location-Based Services (LBS) have become increasingly popular with the dramatic growth of smartphones and social network services (SNS), and their context-rich functionalities attract considerable numbers of users. Many LBS providers use users' location information to offer them convenient and useful functions. However, an LBS can greatly breach personal privacy, because location itself contains much information. Hence, preserving location privacy while achieving utility from it remains a challenging question. This paper tackles this non-trivial challenge by designing a suite of novel fine-grained Privacy-preserving Location Query Protocols (PLQP). Our protocol allows different levels of location query on encrypted location information for different users, and it is efficient enough to be applied on mobile platforms. Comment: 9 pages, 1 figure, 2 tables, IEEE INFOCOM 201
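    The "different levels of query for different users" idea can be illustrated without the paper's cryptography by exchanging salted hashes of grid cells at a level-dependent resolution: equal tokens imply co-location at that granularity, without revealing the coordinates themselves. This is not the PLQP protocol, only a loose sketch of tiered query granularity; all names and cell sizes are assumptions:

```python
import hashlib

def cell_token(lat, lon, level, salt="shared-secret"):
    """Return a salted hash of the grid cell containing (lat, lon).
    Lower access levels use coarser grids, so they can only learn
    coarse co-location; equal tokens mean 'same cell at this level'."""
    size = {1: 1.0, 2: 0.1, 3: 0.01}[level]  # degrees per cell (illustrative)
    cell = (int(lat // size), int(lon // size))
    return hashlib.sha256(f"{salt}:{level}:{cell}".encode()).hexdigest()
```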

    On Safeguarding Privacy and Security in the Framework of Federated Learning

    Motivated by the advancing computational capacity of wireless end-user equipment (UE), as well as the increasing concerns about sharing private data, a new machine learning (ML) paradigm has emerged, namely federated learning (FL). Specifically, FL allows a decoupling of data provision at the UEs and ML model aggregation at a central unit. By training models locally, FL is capable of avoiding data leakage from the UEs, thereby preserving privacy and security to some extent. However, even if raw data are not disclosed by the UEs, individuals' private information can still be extracted by some recently discovered attacks on the FL architecture. In this work, we analyze the privacy and security issues in FL, and raise several challenges in preserving privacy and security when designing FL systems. In addition, we provide extensive simulation results to illustrate the discussed issues and possible solutions. Comment: This paper has been accepted by IEEE Network Magazine
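    The decoupling described above is commonly realized with federated averaging: clients share only model parameters, and the central unit combines them weighted by local dataset size. A minimal sketch (the function names are assumptions, not from the paper):

```python
from typing import List

def federated_average(client_weights: List[List[float]],
                      client_sizes: List[int]) -> List[float]:
    """Aggregate locally trained model parameters by a data-size-weighted
    average (one FedAvg round); raw training examples never leave a client."""
    total = sum(client_sizes)
    dims = len(client_weights[0])
    return [
        sum(w[d] * n for w, n in zip(client_weights, client_sizes)) / total
        for d in range(dims)
    ]
```

    Note that, as the abstract warns, shared parameters can still leak information about individuals, which is exactly the attack surface the paper analyzes.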

    A Location Privacy Extension for DVB-RCS

    In this paper we studied the DVB-RCS (Return Channel through Satellite) standard from a privacy perspective and proposed an approach to incorporate a location privacy enhancing mechanism into the standard. Offering location privacy in DVB-RCS communication is a challenge, as the location of a satellite terminal must be revealed to the network operator of the DVB-RCS network for technical and administrative reasons. We proposed an approach that cloaks the location by intentionally degrading its accuracy while maintaining the operability and integrity of the communication system. In addition, we implemented a proof-of-concept technique applying the theoretical findings of this work on a real DVB-RCS system, and we present the methodology along with the tools used and the experimental results.
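    Cloaking by intentionally degrading accuracy can be sketched as snapping the terminal's coordinates to the centre of a coarse grid cell, so the operator learns an area rather than a point. This is an illustration of the general idea, not the paper's DVB-RCS mechanism; the cell size is an assumed parameter:

```python
def cloak_to_grid(lat, lon, cell_deg=0.1):
    """Snap coordinates to the centre of a cell_deg x cell_deg grid cell,
    deliberately limiting accuracy to roughly the cell size (~11 km for
    0.1 degrees of latitude)."""
    glat = (lat // cell_deg) * cell_deg + cell_deg / 2
    glon = (lon // cell_deg) * cell_deg + cell_deg / 2
    return round(glat, 6), round(glon, 6)
```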

    Optimal Geo-Indistinguishable Mechanisms for Location Privacy

    We consider the geo-indistinguishability approach to location privacy, and the trade-off with respect to utility. We show that, given a desired degree of geo-indistinguishability, it is possible to construct a mechanism that minimizes the service quality loss, using linear programming techniques. In addition we show that, under certain conditions, such a mechanism also provides optimal privacy in the sense of Shokri et al. Furthermore, we propose a method to reduce the number of constraints of the linear program from cubic to quadratic, maintaining the privacy guarantees and without significantly affecting the utility of the generated mechanism. This considerably reduces the time required to solve the linear program, thus significantly enlarging the location sets for which the optimal mechanisms can be computed. Comment: 13 pages
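    The linear program in question can be sketched by enumerating the geo-indistinguishability constraints over a discrete location set: the variables k[x][z] are the probabilities of reporting z when the true location is x, the objective is the prior-weighted expected quality loss, and there is one constraint per triple (x, x', z). This sketch only builds the program (it does not solve it, and it omits the paper's cubic-to-quadratic reduction via spanners); variable names are assumptions:

```python
import itertools
import math

def build_geo_ind_lp(locs, prior, eps, dist):
    """Build the optimal geo-indistinguishable mechanism LP over a
    discrete location set. Returns (c, rows): c is the objective
    (minimize c . k), and each row (a, b) encodes a constraint
    sum_i a_i * k_i <= b. k[x][z] is flattened to index x*n + z."""
    n = len(locs)
    idx = lambda x, z: x * n + z
    # Objective: expected quality loss  sum_x prior(x) sum_z k[x][z] d(x, z)
    c = [prior[x] * dist(locs[x], locs[z])
         for x in range(n) for z in range(n)]
    rows = []
    # Geo-indistinguishability: k[x][z] <= e^(eps * d(x, x')) * k[x'][z]
    for x, xp, z in itertools.product(range(n), repeat=3):
        if x == xp:
            continue
        a = [0.0] * (n * n)
        a[idx(x, z)] = 1.0
        a[idx(xp, z)] = -math.exp(eps * dist(locs[x], locs[xp]))
        rows.append((a, 0.0))
    return c, rows
```

    The n·(n−1)·n constraint rows make the cubic growth concrete, which is why the paper's reduction to a quadratic number of constraints matters for larger location sets.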

    Quantifying privacy in terms of entropy for context aware services
