Anonymous network access using the digital marketplace
With the increasing use of mobile telephony, and the trend towards additional mobile Internet usage, privacy and anonymity become more and more important. Previously published anonymous communication schemes aim to obscure their users' network addresses, because real-world identity can easily be derived from this information. We propose modifications to a novel call-management architecture, the digital marketplace, which will break this link, thereby enabling truly anonymous network access.
Provably-secure symmetric private information retrieval with quantum cryptography
Private information retrieval (PIR) is a database query protocol that
provides user privacy: the user can learn a particular database entry of
interest while the query itself remains hidden from the data centre.
Symmetric private information retrieval (SPIR) takes PIR further by
additionally offering database privacy, where the user cannot learn any
additional entries of the database. Unconditionally secure SPIR solutions with
multiple databases are known classically, but are unrealistic because they
require long shared secret keys between the parties for secure communication
and shared randomness in the protocol. Here, we propose using quantum key
distribution (QKD) instead for a practical implementation, which can realise
both the secure communication and shared randomness requirements. We prove that
QKD maintains the security of the SPIR protocol and that it is also secure
against any external eavesdropper. We also show how such a classical-quantum
system could be implemented practically, using the example of a two-database
SPIR protocol with keys generated by measurement device-independent QKD.
Through key rate calculations, we show that such an implementation is feasible
at the metropolitan level with current QKD technology.
Comment: 19 pages
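The multi-database setting can be illustrated with the classic information-theoretic two-server PIR scheme, in which the XOR of the two servers' answers reveals exactly one entry. This is a generic textbook sketch, not the paper's specific protocol; full SPIR would additionally require shared randomness between the servers, which the paper proposes generating with QKD.

```python
import secrets

def pir_query(db_size, index):
    """Client: two random subset queries that differ only at `index`."""
    q1 = [secrets.randbelow(2) for _ in range(db_size)]
    q2 = list(q1)
    q2[index] ^= 1
    return q1, q2

def pir_answer(db, query):
    """Server: XOR of the selected entries; one query alone is uniform
    and so reveals nothing about which index the client wants."""
    ans = 0
    for bit, entry in zip(query, db):
        if bit:
            ans ^= entry
    return ans

db = [5, 12, 7, 9]            # toy database of integers
q1, q2 = pir_query(len(db), 2)
recovered = pir_answer(db, q1) ^ pir_answer(db, q2)
assert recovered == db[2]     # the symmetric difference is exactly index 2
```

Because each query on its own is a uniformly random subset, neither server learns the queried index; user privacy here relies on the two servers not colluding.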
"Am I Private and If So, How Many?" - Communicating Privacy Guarantees of Differential Privacy with Risk Communication Formats
Decisions about sharing personal information are not trivial, since there are
many legitimate and important purposes for such data collection, but often the
collected data can reveal sensitive information about individuals.
Privacy-preserving technologies, such as differential privacy (DP), can be
employed to protect the privacy of individuals and, furthermore, provide
mathematically sound guarantees on the maximum privacy risk. However, they can
only support informed privacy decisions if individuals understand the provided
privacy guarantees. This article proposes a novel approach for communicating
privacy guarantees to support individuals in their privacy decisions when
sharing data. For this, we adopt risk communication formats from the medical
domain in conjunction with a model for privacy guarantees of DP to create
quantitative privacy risk notifications. We conducted a crowd-sourced study
with 343 participants to evaluate how well our notifications conveyed the
privacy risk information and how confident participants were about their own
understanding of the privacy risk. Our findings suggest that these new
notifications can communicate the objective information similarly well to
currently used qualitative notifications, but leave individuals less confident
in their understanding. We also discovered that several of our notifications
and the currently used qualitative notification disadvantage individuals with
low numeracy: these individuals appear overconfident compared to their actual
understanding of the associated privacy risks and are, therefore, less likely
to seek the needed additional information before making an informed decision. The
promising results allow for multiple directions in future research, for
example, adding visual aids or tailoring privacy risk communication to
characteristics of the individuals.
Comment: Accepted to ACM CCS 2022. arXiv admin note: substantial text overlap
with arXiv:2204.0406
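To make a "quantitative privacy risk notification" concrete: for an epsilon-DP mechanism, a standard textbook bound limits an adversary's posterior belief that a target record is in the dataset, given their prior. Whether this matches the paper's exact risk model is an assumption; the bound itself is standard.

```python
import math

def posterior_risk_bound(epsilon, prior):
    """Upper bound on an adversary's posterior belief that a record is
    in the dataset, after seeing one epsilon-DP output: epsilon-DP caps
    the likelihood ratio at exp(epsilon), so posterior odds are at most
    exp(epsilon) times prior odds."""
    num = math.exp(epsilon) * prior
    return num / (num + (1.0 - prior))

# With a 50% prior and epsilon = 1, the posterior is at most ~73.1%,
# which is the kind of number a risk notification could display.
bound = posterior_risk_bound(1.0, 0.5)
```

At epsilon = 0 the bound equals the prior (the output reveals nothing), and it increases monotonically in epsilon, which is what makes such a figure usable in a risk-communication format.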
PEPSI: Privacy-Enhanced Participatory Sensing Infrastructure.
Participatory Sensing combines the ubiquity of mobile phones with sensing capabilities of Wireless Sensor Networks. It targets pervasive collection of information, e.g., temperature, traffic conditions, or health-related data. As users produce measurements from their mobile devices, voluntary participation becomes essential. However, a number of privacy concerns -- due to the personal information conveyed by data reports -- hinder large-scale deployment of participatory sensing applications. Prior work on privacy protection for participatory sensing has often relied on unrealistic assumptions and offered no provable security guarantees.
The goal of this project is to introduce PEPSI: a Privacy-Enhanced Participatory Sensing Infrastructure. We explore realistic architectural assumptions and a minimal set of (formal) privacy requirements, aiming at protecting privacy of both data producers and consumers. We design a solution that attains privacy guarantees with provable security at very low additional computational cost and almost no extra communication overhead.
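PEPSI's actual construction builds on identity-based encryption; as a rough, hypothetical stand-in, the decoupling of data producers, queriers, and an oblivious service provider can be sketched with keyed tags. All names and the key-distribution step below are illustrative assumptions, not the paper's protocol.

```python
import hashlib
import hmac
import os

def tag(query_secret: bytes, query_name: str) -> bytes:
    """Opaque tag for a query type; without the secret, the service
    provider cannot link a tag back to the query string."""
    return hmac.new(query_secret, query_name.encode(), hashlib.sha256).digest()

# A registration authority would issue this secret out-of-band to
# producers and authorised queriers (hypothetical setup step).
secret = os.urandom(32)

producer_tag = tag(secret, "temperature/zone-3")   # attached to a report
consumer_tag = tag(secret, "temperature/zone-3")   # attached to a subscription

# The provider forwards the (separately encrypted) report iff the tags
# match, learning neither the query string nor the report contents.
assert hmac.compare_digest(producer_tag, consumer_tag)
```

The point of the sketch is only the role separation: matching happens on opaque tags, so the in-the-middle infrastructure sees which reports flow where but not what they mean.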
Privacy-Friendly Mobility Analytics using Aggregate Location Data
Location data can be extremely useful to study commuting patterns and
disruptions, as well as to predict real-time traffic volumes. At the same time,
however, the fine-grained collection of user locations raises serious privacy
concerns, as this can reveal sensitive information about the users, such as
lifestyle, political and religious inclinations, or even identities. In this
paper, we study the feasibility of crowd-sourced mobility analytics over
aggregate location information: users periodically report their location, using
a privacy-preserving aggregation protocol, so that the server can only recover
aggregates -- i.e., how many, but not which, users are in a region at a given
time. We experiment with real-world mobility datasets obtained from the
Transport for London authority and the San Francisco Cabs network, and present
a novel methodology based on time series modeling that is geared to forecast
traffic volumes in regions of interest and to detect mobility anomalies in
them. In the presence of anomalies, we also make enhanced traffic volume
predictions by feeding our model with additional information from correlated
regions. Finally, we present and evaluate a mobile app prototype, called
Mobility Data Donors (MDD), in terms of computation, communication, and energy
overhead, demonstrating the real-world deployability of our techniques.
Comment: Published at ACM SIGSPATIAL 201
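The "server can only recover aggregates" property can be sketched with a standard pairwise additive-masking scheme: each user's report looks like noise on its own, but the masks cancel in the sum. This is a generic illustration of privacy-preserving aggregation, not necessarily the exact protocol used in the paper.

```python
import random

def masked_reports(counts, seed=0):
    """Each pair of users (i, j), i < j, agrees on a mask that user i
    adds and user j subtracts, so every mask cancels in the total."""
    rng = random.Random(seed)  # stands in for pairwise agreed randomness
    n = len(counts)
    masks = [[rng.randrange(1 << 16) for _ in range(n)] for _ in range(n)]
    reports = []
    for i, c in enumerate(counts):
        r = c
        for j in range(n):
            if i < j:
                r += masks[i][j]
            elif i > j:
                r -= masks[j][i]
        reports.append(r)
    return reports

counts = [1, 0, 3, 2]               # per-user presence in a region
reports = masked_reports(counts)
assert sum(reports) == sum(counts)  # server learns only the aggregate
```

The server thus learns how many users are in a region at a given time, but an individual report reveals nothing about its sender's location on its own (assuming users do not collude with the server).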
How to Securely Compute the Modulo-Two Sum of Binary Sources
In secure multiparty computation, mutually distrusting users in a network
want to collaborate to compute functions of data which is distributed among the
users. The users should not learn anything about the data of
others beyond what they may infer from their own data and the functions they are
computing. Previous works have mostly considered the worst case context (i.e.,
without assuming any distribution for the data); Lee and Abbe (2014) is a
notable exception. Here, we study the average case (i.e., we work with a
distribution on the data) where correctness and privacy are only desired
asymptotically.
For concreteness and simplicity, we consider a secure version of the function
computation problem of K\"orner and Marton (1979) where two users observe a
doubly symmetric binary source with parameter p and the third user wants to
compute the XOR. We show that the amount of communication and randomness
resources required depends on the level of correctness desired. When zero-error
and perfect privacy are required, the results of Data et al. (2014) show that
it can be achieved if and only if a total rate of 1 bit is communicated between
every pair of users and private randomness at the rate of 1 is used up. In
contrast, we show here that, if we only want the probability of error to vanish
asymptotically in block length, it can be achieved by a lower rate (binary
entropy of p) for all the links and for private randomness; this also
guarantees perfect privacy. We also show that no smaller rates are possible
even if privacy is only required asymptotically.
Comment: 6 pages, 1 figure, extended version of submission to IEEE Information
Theory Workshop, 201
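The zero-error, perfect-privacy regime described above amounts to a one-time-pad construction: users 1 and 2 mask their bits with one bit of shared randomness, and user 3 XORs the two messages. A minimal sketch of one round:

```python
import secrets

def secure_xor_round(x, y):
    """Users 1 and 2 hold bits x and y; user 3 learns x XOR y.
    Each message alone is uniformly random, so it reveals nothing
    about the individual inputs."""
    r = secrets.randbelow(2)   # 1 bit of randomness shared by users 1 and 2
    m1 = x ^ r                 # user 1 -> user 3
    m2 = y ^ r                 # user 2 -> user 3
    return m1 ^ m2             # equals x ^ y, since the pads cancel

for x in (0, 1):
    for y in (0, 1):
        assert secure_xor_round(x, y) == x ^ y
```

This matches the stated resource costs of the zero-error scheme, 1 bit on each link and randomness at rate 1 per computed XOR; the paper's contribution is showing that the binary entropy of p suffices when only vanishing error is required.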
FedRec+: Enhancing Privacy and Addressing Heterogeneity in Federated Recommendation Systems
Preserving privacy and reducing communication costs for edge users pose
significant challenges in recommendation systems. Although federated learning
has proven effective in protecting privacy by avoiding data exchange between
clients and servers, it has been shown that the server can infer user ratings
based on updated non-zero gradients obtained from two consecutive rounds of
user-uploaded gradients. Moreover, federated recommendation systems (FRS) face
the challenge of heterogeneity, leading to decreased recommendation
performance. In this paper, we propose FedRec+, an ensemble framework for FRS
that enhances privacy while addressing the heterogeneity challenge. FedRec+
employs optimal subset selection based on feature similarity to generate
near-optimal virtual ratings for pseudo items, utilizing only the user's local
information. This approach reduces noise without incurring additional
communication costs. Furthermore, we utilize the Wasserstein distance to
estimate the heterogeneity and contribution of each client, and derive optimal
aggregation weights by solving a defined optimization problem. Experimental
results demonstrate the state-of-the-art performance of FedRec+ across various
reference datasets.
Comment: Accepted by 59th Annual Allerton Conference on Communication,
Control, and Computing
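As a toy illustration of the Wasserstein-based weighting idea: in one dimension, the empirical 1-Wasserstein distance between two equal-size samples reduces to the mean absolute difference of the sorted values. The inverse-distance weights below are one plausible instantiation for intuition only; the paper derives its aggregation weights by solving a defined optimization problem.

```python
def wasserstein_1d(a, b):
    """Empirical 1-Wasserstein distance between two equally sized
    1-D samples: mean absolute difference after sorting both."""
    sa, sb = sorted(a), sorted(b)
    return sum(abs(x - y) for x, y in zip(sa, sb)) / len(sa)

def aggregation_weights(reference, client_samples):
    """Hypothetical heuristic: weight each client inversely to its
    distance from a reference distribution, then normalise."""
    dists = [wasserstein_1d(reference, s) for s in client_samples]
    inv = [1.0 / (1.0 + d) for d in dists]
    total = sum(inv)
    return [w / total for w in inv]

ref = [3, 4, 4, 5]                       # toy reference rating sample
clients = [[3, 4, 4, 5], [1, 1, 2, 2]]   # one matching, one shifted client
w = aggregation_weights(ref, clients)
assert w[0] > w[1]                       # closer client gets more weight
```

The sketch only shows the ingredient the abstract names: a distance that captures client heterogeneity can be turned into per-client contributions at aggregation time.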