Decentralized Matrix Factorization with Heterogeneous Differential Privacy
Conventional matrix factorization relies on the centralized collection of users'
data for recommendation, which might introduce an increased risk of privacy
leakage, especially when the recommender is untrusted. Existing differentially
private matrix factorization methods either assume the recommender is trusted
or can only provide a uniform level of privacy protection for all users and
items under an untrusted recommender. In this paper, we propose a novel
Heterogeneous Differentially Private Matrix Factorization algorithm (denoted
HDPMF) for untrusted recommenders. To the best of our knowledge, we are the
first to achieve heterogeneous differential privacy for decentralized matrix
factorization in the untrusted recommender setting. Specifically, our framework
uses a modified stretching mechanism with an innovative rescaling scheme to
achieve a better trade-off between privacy and accuracy. Meanwhile, by allocating
the privacy budget properly, we can capture homogeneous privacy preferences within
a user/item but heterogeneous privacy preferences across different users/items.
Theoretical analysis confirms that HDPMF provides a rigorous privacy guarantee,
and extensive experiments demonstrate its superiority, especially under strong
privacy guarantees, high-dimensional models, and sparse datasets.
Comment: Accepted by the 22nd IEEE International Conference on Trust, Security
and Privacy in Computing and Communications (TrustCom-2023)
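The abstract above does not spell out the HDPMF algorithm, but the general setting it describes, decentralized matrix factorization where each user keeps their own profile local and perturbs the gradients they send to an untrusted recommender according to their own privacy budget, can be sketched as follows. This is a minimal illustration using per-user Laplace gradient perturbation in place of the paper's stretching/rescaling mechanism; all sizes, hyperparameters, and the clipping scheme are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed small synthetic setting: 20 users, 30 items, rank-5 factors
n_users, n_items, k = 20, 30, 5
R = rng.random((n_users, n_items))            # ratings in [0, 1]
mask = rng.random((n_users, n_items)) < 0.3   # which entries are observed

U = rng.normal(scale=0.1, size=(n_users, k))  # user profiles (kept on-device)
V = rng.normal(scale=0.1, size=(n_items, k))  # item profiles (at the recommender)

# Heterogeneous budgets: each user chooses their own epsilon
eps = rng.uniform(0.5, 5.0, size=n_users)
lr, lam, epochs = 0.05, 0.01, 30
grad_clip = 1.0  # bounds the sensitivity of each uploaded gradient

for _ in range(epochs):
    for i in range(n_users):
        obs = np.where(mask[i])[0]
        if obs.size == 0:
            continue
        err = R[i, obs] - U[i] @ V[obs].T
        # The user's own profile update never leaves the device, so no noise
        U[i] += lr * (err @ V[obs] - lam * U[i])
        # The item-factor gradient is clipped, then perturbed with Laplace
        # noise calibrated to THIS user's budget eps[i] before upload
        g = np.clip(np.outer(err, U[i]) - lam * V[obs], -grad_clip, grad_clip)
        noise = rng.laplace(scale=2 * grad_clip / eps[i], size=g.shape)
        V[obs] += lr * (g + noise)

rmse = np.sqrt(np.mean((R[mask] - (U @ V.T)[mask]) ** 2))
```

Users with a small `eps[i]` (stronger privacy) inject proportionally larger noise into their uploads, which is the heterogeneity the abstract refers to; the paper's rescaling scheme is precisely about mitigating the accuracy cost of such strong-privacy users.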
Differentially Private Federated Clustering over Non-IID Data
In this paper, we investigate the federated clustering (FedC) problem, which aims
to accurately partition unlabeled data samples distributed over massive clients
into a finite number of clusters under the orchestration of a parameter server,
while preserving data privacy. Although it is an NP-hard optimization problem
involving real variables denoting cluster centroids and binary variables
denoting the cluster membership of each data sample, we judiciously reformulate
the FedC problem into a non-convex optimization problem with only one convex
constraint, thereby yielding a soft clustering solution. Then a novel FedC
algorithm based on the differential privacy (DP) technique, referred to as
DP-FedC, is proposed, which also accounts for partial client participation and
multiple local model-updating steps. Furthermore, various properties of the
proposed DP-FedC are obtained through theoretical analyses of privacy
protection and convergence rate, especially for the case of non-independent
and identically distributed (non-i.i.d.) data, which serve as guidelines for
the design of DP-FedC. Experimental results on two real datasets demonstrate
the efficacy of the proposed DP-FedC, its superior performance over some
state-of-the-art FedC algorithms, and its consistency with the presented
analytical results.
Comment: 31 pages, 4 figures, 1 table
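The DP-FedC algorithm itself is not given in the abstract, but the ingredients it names, soft cluster memberships, partial client participation, and DP noise added to client uploads over non-IID data, can be combined in a short sketch. The following uses a softmax-based soft assignment and the Gaussian mechanism on per-client sufficient statistics; the data layout, temperature, clipping bound, and noise scale are all assumptions for illustration, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(1)

n_clients, d, k = 8, 2, 3
rounds, temp = 20, 0.2
clip, sigma = 6.0, 0.5  # assumed per-coordinate data bound and Gaussian noise scale

# Non-IID synthetic data: each client draws mostly from one cluster
true_centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
client_data = [
    true_centers[c % k] + rng.normal(scale=0.5, size=(40, d))
    for c in range(n_clients)
]

centers = rng.normal(size=(k, d))  # the parameter server's global centroids

for _ in range(rounds):
    sums = np.zeros((k, d))
    counts = np.zeros(k)
    # Partial participation: a random half of the clients joins each round
    for c in rng.choice(n_clients, size=n_clients // 2, replace=False):
        X = client_data[c]
        # Soft membership: stabilized softmax over negative squared distances
        d2 = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        w = np.exp(-(d2 - d2.min(1, keepdims=True)) / temp)
        w /= w.sum(1, keepdims=True)
        # Clip the per-client statistics, then perturb them with Gaussian
        # noise before they leave the client (Gaussian-mechanism sketch)
        s = np.clip(w.T @ X, -clip * len(X), clip * len(X))
        sums += s + rng.normal(scale=sigma, size=s.shape)
        counts += w.sum(0) + rng.normal(scale=sigma, size=k)
    # The server only updates centroids with enough (noisy) mass behind them
    upd = counts > 1.0
    centers[upd] = sums[upd] / counts[upd][:, None]
```

Because the noise is added on the client side, the server never sees exact statistics, matching the untrusted-aggregator flavor of the problem; the paper's analysis concerns how this noise and the non-IID skew across clients affect the convergence of such iterations.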