Achieving Perfect Location Privacy in Wireless Devices Using Anonymization
The popularity of mobile devices and location-based services (LBS) has created great concern regarding the location privacy of the users of such devices and services. Anonymization is a common technique used to protect the location privacy of LBS users: it assigns a random pseudonym to each user, and these pseudonyms can change over time. Here, we provide a general information-theoretic definition of perfect location privacy and prove that perfect location privacy is achievable for mobile devices when the anonymization technique is applied appropriately. First, we assume that a user's current location is independent of her past locations. Using this i.i.d. model, we show that if the user's pseudonym is changed before O(n^{2/(r-1)}) anonymized observations are made by the adversary for that user, then she has perfect location privacy, where n is the number of users in the network and r is the number of all possible locations the user might occupy. Then, we model each user's movement by a Markov chain, so that a user's current location depends on her previous locations, which is a more realistic model when approximating real-world data. We show that perfect location privacy is achievable in this model if the user's pseudonym is changed before O(n^{2/(|E|-r)}) anonymized observations are collected by the adversary for that user, where |E| is the number of edges in the user's Markov model.
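As an illustration of the anonymization model described in this abstract (a minimal sketch, not the authors' code), the following Python snippet simulates n users moving over r locations under the i.i.d. model, while the adversary observes only traces labeled by random pseudonyms that are re-drawn after a fixed number of observations. All names and parameter values are chosen purely for illustration.

```python
# Minimal sketch (illustrative assumptions, not the paper's code) of the
# anonymization model: n users, r locations, pseudonyms re-drawn every m observations.
import numpy as np

rng = np.random.default_rng(0)

n, r, m = 5, 4, 10                              # users, locations, observations per pseudonym period
# Each user has her own location distribution, unknown to the adversary.
profiles = rng.dirichlet(np.ones(r), size=n)

def anonymized_trace(num_periods):
    """Yield (pseudonym, location) observations as seen by the adversary."""
    for _ in range(num_periods):
        pseudonyms = rng.permutation(n)          # fresh random pseudonym assignment
        for _ in range(m):                       # m observations before pseudonyms change
            for user in range(n):
                loc = rng.choice(r, p=profiles[user])
                yield pseudonyms[user], loc

# Example: the adversary's view over two pseudonym periods.
observations = list(anonymized_trace(num_periods=2))
print(observations[:5])
```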
Information-Theoretic Limits on Statistical Matching with Applications to Privacy
Modern applications significantly enhance the user experience by adapting to each user's individual condition and/or preferences. While this adaptation can greatly improve a user's experience or be essential for the application to work, the exposure of user data to the application presents a significant privacy threat to users, even when the traces are anonymized, since statistical matching of an anonymized trace to prior user behavior can identify a user and her habits. Because of the current and growing algorithmic and computational capabilities of adversaries, provable privacy guarantees as a function of the degree of anonymization and obfuscation of the traces are necessary. This dissertation focuses on deriving theoretical bounds on the privacy of users in such a scenario. Here we derive the fundamental limits of user privacy when both anonymization and obfuscation-based protection mechanisms are applied to users' time series of data. We investigate the impact of such mechanisms on the trade-off between privacy protection and user utility. In the first part, the requirements on anonymization and obfuscation are obtained for the case in which data traces are independent between users. However, the data traces of different users will be dependent in many applications, and an adversary can potentially exploit this dependency. So in the next part, we consider the impact of dependency between user traces on their privacy. To do so, we demonstrate that the adversary can readily identify the association graph of the obfuscated and anonymized version of the data, revealing which user data traces are dependent, and then we demonstrate that the adversary can use this association graph to break user privacy with significantly shorter traces than in the case of independent users. As a result, we show that inter-user dependency degrades user privacy. We show that obfuscating data traces independently across users is often insufficient to remedy such leakage. Therefore, we discuss how users can improve privacy by employing joint obfuscation that removes the data dependency. Finally, we discuss how the remapping technique can be used to improve user utility, and how much information remapping leaks to the adversary when the adversary does not have full prior information.
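The statistical-matching threat the abstract refers to can be sketched as follows (an illustrative assumption, not the dissertation's method): the adversary holds prior per-user distributions and assigns an anonymized trace to the user whose prior makes the trace most likely.

```python
# Minimal sketch of statistical matching via maximum likelihood
# (illustrative parameters, not taken from the dissertation).
import numpy as np

rng = np.random.default_rng(1)

r = 4                                              # number of possible data values
priors = rng.dirichlet(np.ones(r), size=3)         # adversary's priors for users 0..2

true_user = 2
trace = rng.choice(r, size=50, p=priors[true_user])   # anonymized trace (label hidden)

# Log-likelihood of the trace under each user's prior; argmax is the adversary's guess.
log_liks = np.array([np.log(p[trace]).sum() for p in priors])
print("adversary's guess:", log_liks.argmax(), " true user:", true_user)
```

Longer traces make this matching increasingly reliable, which is why the dissertation studies privacy as a function of trace length and of the anonymization and obfuscation applied.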
On the Anonymization of Differentially Private Location Obfuscation
Obfuscation techniques in location-based services (LBSs) have been shown to be useful for hiding the concrete locations of service users, but they do not necessarily provide anonymity. We quantify the anonymity of the location data obfuscated by the planar Laplacian mechanism and by the optimal geo-indistinguishable mechanism of Bordenabe et al. We empirically show that the latter provides stronger anonymity than the former, in the sense that more users in the database satisfy k-anonymity. To formalize and analyze such approximate anonymity, we introduce the notion of asymptotic anonymity. We then show that the location data obfuscated by the optimal geo-indistinguishable mechanism can be anonymized by removing a smaller number of users from the database. Furthermore, we demonstrate that the optimal geo-indistinguishable mechanism has better utility both for users and for data analysts.
Comment: ISITA'18 conference paper
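For reference, the planar Laplacian mechanism mentioned above can be sketched as follows (an assumed implementation based on the standard planar Laplace sampling procedure, not the paper's code). Coordinates are treated as planar for simplicity, and the function name and parameters are illustrative.

```python
# Minimal sketch of the planar Laplace mechanism for geo-indistinguishability:
# the noise direction is uniform and the radius follows the planar Laplace radial law.
import numpy as np
from scipy.special import lambertw

rng = np.random.default_rng(2)

def planar_laplace(x, y, epsilon):
    """Return an epsilon-geo-indistinguishable perturbation of the point (x, y)."""
    theta = rng.uniform(0, 2 * np.pi)                      # uniform direction
    p = rng.uniform(0, 1)
    # Inverse CDF of the radius uses the -1 branch of the Lambert W function.
    radius = -(lambertw((p - 1) / np.e, k=-1).real + 1) / epsilon
    return x + radius * np.cos(theta), y + radius * np.sin(theta)

print(planar_laplace(48.8566, 2.3522, epsilon=1.0))        # example: a point near Paris
```

Smaller epsilon values yield larger expected noise radii, trading utility for stronger geo-indistinguishability.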