Privacy of Dependent Users Against Statistical Matching
Modern applications significantly enhance user experience by adapting to each user's individual condition or preferences. While this adaptation can greatly improve the user's experience, or even be essential for the application to
work, the exposure of user data to the application presents a significant
privacy threat to the users, even when the traces are anonymized, since the statistical matching of an anonymized trace to
prior user behavior can identify a user and their habits. Because of the
current and growing algorithmic and computational capabilities of adversaries,
provable privacy guarantees as a function of the degree of anonymization and
obfuscation of the traces are necessary. Our previous work has established the
requirements on anonymization and obfuscation in the case that data traces are
independent between users. However, the data traces of different users will be dependent in many applications, and an adversary can potentially exploit such dependencies.
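To make the threat concrete, here is a minimal sketch of a statistical matching attack, assuming discrete-valued traces and per-user prior distributions learned from past behavior. The i.i.d. likelihood model, the function name, and the toy profiles are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def match_trace(trace, user_profiles):
    """Return the user whose prior behavioral profile maximizes the
    likelihood of the anonymized trace (i.i.d. model, finite alphabet)."""
    best_user, best_ll = None, -np.inf
    for user, profile in user_profiles.items():
        # A small floor avoids log(0) for symbols unseen in the prior.
        ll = sum(np.log(max(profile.get(x, 0.0), 1e-12)) for x in trace)
        if ll > best_ll:
            best_user, best_ll = user, ll
    return best_user

# Toy example: two users with distinct habits over the alphabet {0, 1}.
profiles = {"alice": {0: 0.9, 1: 0.1}, "bob": {0: 0.2, 1: 0.8}}
print(match_trace([0, 0, 1, 0, 0, 0, 1, 0], profiles))  # -> alice
```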
In this paper, we consider the impact of dependency between user traces on their privacy. First, we demonstrate that the adversary can readily identify the association graph of the obfuscated and anonymized version of the data, revealing which user data traces are dependent.
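This detection step can be illustrated with a simple stand-in for the adversary's test: flag a pair of released traces as dependent when their empirical correlation is large. The correlation statistic and the threshold value are assumptions of this sketch; the paper's detector may differ.

```python
import numpy as np

def estimate_association_graph(traces, threshold=0.5):
    """Return edges (i, j) between released traces whose empirical
    correlation exceeds the threshold; `traces` is (n_users, T)."""
    corr = np.corrcoef(traces)
    n = traces.shape[0]
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(corr[i, j]) >= threshold]

# Toy example: traces 0 and 1 share a common component, trace 2 does not.
rng = np.random.default_rng(0)
base = rng.standard_normal(10_000)
traces = np.stack([base + 0.3 * rng.standard_normal(10_000),
                   base + 0.3 * rng.standard_normal(10_000),
                   rng.standard_normal(10_000)])
print(estimate_association_graph(traces))  # -> [(0, 1)]
```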
Next, we demonstrate that the adversary can use this association graph to break user privacy with significantly shorter traces than in the case of independent users, and that obfuscating data traces independently across users is often insufficient to remedy such leakage.
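The following hypothetical sketch shows one way the association graph can shorten the attack: a pair of anonymized traces known to be dependent is matched jointly, and only against pairs of users whose prior data marks them as associated, so far fewer hypotheses must be separated. The group-attack model and the names `joint_match` and `associated_pairs` are assumptions for illustration, not the paper's scheme.

```python
import numpy as np

def joint_match(pair, profiles, associated_pairs):
    """Hypothetical group attack: jointly match two dependent anonymized
    traces against only those user pairs that prior data says are
    associated. The shrunken hypothesis set is what permits shorter
    traces than per-user matching."""
    def loglik(trace, profile):
        return sum(np.log(max(profile.get(x, 0.0), 1e-12)) for x in trace)
    best, best_ll = None, -np.inf
    for u, v in associated_pairs:
        for a, b in ((u, v), (v, u)):  # both assignments of the pair
            ll = loglik(pair[0], profiles[a]) + loglik(pair[1], profiles[b])
            if ll > best_ll:
                best, best_ll = (a, b), ll
    return best
```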
Finally, we discuss how users can improve privacy by employing joint obfuscation that removes or reduces the data dependency across users.
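As an illustration of this defensive direction, here is a minimal sketch of a joint obfuscator that whitens the traces across users before adding independent noise, so the released traces are empirically uncorrelated. This whitening-plus-noise construction is an assumption of the sketch, not necessarily the mechanism analyzed in the paper.

```python
import numpy as np

def joint_obfuscate(traces, noise_std=1.0, rng=None):
    """Decorrelate an (n_users, T) array of traces across users with a
    whitening transform, then add independent noise to each trace."""
    rng = np.random.default_rng() if rng is None else rng
    centered = traces - traces.mean(axis=1, keepdims=True)
    L = np.linalg.cholesky(np.cov(centered))  # inter-user covariance factor
    whitened = np.linalg.solve(L, centered)   # empirically uncorrelated
    return whitened + noise_std * rng.standard_normal(traces.shape)

# With the same toy traces as above, the released versions are
# near-uncorrelated, so a correlation-threshold detector no longer
# recovers the (0, 1) edge.
rng = np.random.default_rng(1)
base = rng.standard_normal(10_000)
traces = np.stack([base + 0.3 * rng.standard_normal(10_000),
                   base + 0.3 * rng.standard_normal(10_000),
                   rng.standard_normal(10_000)])
released = joint_obfuscate(traces, rng=rng)
print(np.round(np.corrcoef(released), 2))  # off-diagonal entries ~ 0
```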
Comment: Submitted to IEEE Transactions on Information Theory