Information-Theoretic Limits on Statistical Matching with Applications to Privacy
Modern applications significantly enhance the user experience by adapting to each user's individual condition and/or preferences. While this adaptation can greatly improve the experience or even be essential for the application to work, the exposure of user data presents a significant privacy threat, even when the traces are anonymized, since statistically matching an anonymized trace to prior user behavior can identify a user and their habits. Given the current and growing algorithmic and computational capabilities of adversaries, provable privacy guarantees as a function of the degree of anonymization and obfuscation of the traces are necessary. This dissertation derives theoretical bounds on user privacy in such a scenario. We derive the fundamental limits of user privacy when both anonymization- and obfuscation-based protection mechanisms are applied to users' time series of data, and we investigate the impact of such mechanisms on the trade-off between privacy protection and user utility. In the first part, we obtain the requirements on anonymization and obfuscation for the case that data traces are independent between users. However, in many applications the data traces of different users are dependent, and an adversary can potentially exploit this dependency. In the next part, we therefore consider the impact of dependency between user traces on their privacy. We first demonstrate that the adversary can readily identify the association graph of the obfuscated and anonymized data, revealing which user traces are dependent, and can then use this association graph to break user privacy with significantly shorter traces than in the case of independent users. Thus, inter-user dependency degrades user privacy, and obfuscating data traces independently across users is often insufficient to remedy the leakage. We therefore discuss how users can improve their privacy by employing joint obfuscation that removes the data dependency. Finally, we discuss how remapping can improve user utility, and how much information remapping leaks to the adversary when the adversary does not have full prior information.
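The statistical-matching attack described above can be illustrated with a toy sketch. All parameters here are illustrative assumptions, not the dissertation's model: users draw i.i.d. samples from their own categorical distribution (which the adversary knows from prior observation), and anonymization is a secret permutation of trace labels. The adversary matches each anonymized trace to the prior under which it has maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each user emits i.i.d. samples from their own
# categorical distribution over a small alphabet (e.g., visited locations).
n_users, alphabet, trace_len = 5, 4, 2000
priors = rng.dirichlet(np.ones(alphabet), size=n_users)  # known to the adversary

# Users emit fresh traces; the anonymizer applies a secret permutation.
traces = np.array([rng.choice(alphabet, size=trace_len, p=p) for p in priors])
perm = rng.permutation(n_users)
anon_traces = traces[perm]

def match(anon, priors):
    """Match each anonymized trace to the prior maximizing its likelihood."""
    guesses = []
    for t in anon:
        counts = np.bincount(t, minlength=priors.shape[1])
        loglik = counts @ np.log(priors.T)  # log-likelihood under each prior
        guesses.append(int(np.argmax(loglik)))
    return np.array(guesses)

recovered = match(anon_traces, priors)
# With traces long enough, the adversary recovers the permutation
# with high probability, defeating anonymization alone.
print((recovered == perm).mean())
```

This is exactly why the abstract argues that provable guarantees must be stated as a function of trace length and the degree of anonymization and obfuscation: with no obfuscation and long enough traces, matching succeeds almost surely.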
Asymptotic Loss in Privacy due to Dependency in Gaussian Traces
The rapid growth of the Internet of Things (IoT) necessitates employing privacy-preserving techniques to protect users' sensitive information. Even when user traces are anonymized, statistical matching can be employed to infer sensitive information. In our previous work, we established the privacy requirements for the case that the user traces are instantiations of discrete random variables and the adversary knows only the structure of the dependency graph, i.e., whether each pair of users is connected. In this paper, we consider the case where data traces are instantiations of Gaussian random variables and the adversary knows not only the structure of the graph but also the pairwise correlation coefficients. We establish the requirements on anonymization needed to thwart such statistical matching, which demonstrate the significant degree to which knowledge of the pairwise correlation coefficients further aids the adversary in breaking user anonymity.

Comment: IEEE Wireless Communications and Networking Conference