Decentralized Differentially Private Without-Replacement Stochastic Gradient Descent
While machine learning has achieved remarkable results in a wide variety of
domains, the training of models often requires large datasets that may need to
be collected from different individuals. As sensitive information may be
contained in the individual's dataset, sharing training data may lead to severe
privacy concerns. Therefore, there is a compelling need to develop
privacy-aware machine learning methods, for which one effective approach is to
leverage the generic framework of differential privacy. Considering that
stochastic gradient descent (SGD) is one of the most widely adopted methods for
large-scale machine learning problems, two decentralized differentially private
SGD algorithms are proposed in this work. In particular, we focus on SGD without
replacement due to its favorable structure for practical implementation. In
addition, both privacy and convergence analysis are provided for the proposed
algorithms. Finally, extensive experiments are performed to verify the
theoretical results and demonstrate the effectiveness of the proposed
algorithms.
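The abstract gives no pseudocode, so as a rough illustration of the without-replacement (shuffled-pass) differentially private SGD pattern it describes, here is a minimal single-node sketch. This is not the paper's decentralized algorithm; the function name, parameter names, and the use of per-example clipping plus Gaussian noise are assumptions for this toy version.

```python
import numpy as np

def dp_sgd_without_replacement(X, y, lr=0.1, clip=1.0,
                               noise_mult=1.0, epochs=3, seed=0):
    """Sketch (not the paper's method): each epoch visits the data in a
    random permutation (sampling without replacement); each per-example
    gradient is norm-clipped to bound sensitivity, then Gaussian noise
    is added before the update. Hyperparameter names are hypothetical."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):          # without-replacement pass
            g = (X[i] @ w - y[i]) * X[i]      # squared-loss gradient, one example
            g = g / max(1.0, np.linalg.norm(g) / clip)  # clip to bound sensitivity
            g = g + rng.normal(0.0, noise_mult * clip, d)  # Gaussian mechanism
            w = w - lr * g
    return w
```

The permutation-per-epoch loop is what distinguishes this from with-replacement SGD: every example contributes exactly once per pass, which is the structural property the abstract highlights as favorable for practical implementation.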
Finito: A Faster, Permutable Incremental Gradient Method for Big Data Problems
Recent advances in optimization theory have shown that smooth strongly convex
finite sums can be minimized faster than by treating them as a black box
"batch" problem. In this work we introduce a new method in this class with a
theoretical convergence rate four times faster than existing methods, for sums
with sufficiently many terms. This method is also amenable to a sampling
without replacement scheme that in practice gives further speed-ups. We give
empirical results showing state-of-the-art performance.
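As an illustration of the table-based incremental gradient pattern behind methods of this class, here is a hedged Finito-style sketch for a smooth, strongly convex finite sum. The step constant `s = 2 * mu` and the per-pass permutation (the "permutable" access order) are assumptions for this toy version, not the paper's exact specification.

```python
import numpy as np

def finito_style(grad_fns, d, mu, n_passes=5, seed=0):
    """Sketch of a Finito-style method for minimizing (1/n) * sum_i f_i(w),
    where each f_i is mu-strongly convex and grad_fns[i](w) returns its
    gradient. Keeps a table of points phi_i and their gradients g_i, and
    updates one table entry per step, visiting indices in a random
    without-replacement order each pass. s = 2*mu is a hypothetical choice."""
    n = len(grad_fns)
    rng = np.random.default_rng(seed)
    phi = np.zeros((n, d))
    g = np.array([grad_fns[i](phi[i]) for i in range(n)])
    s = 2.0 * mu
    for _ in range(n_passes):
        for j in rng.permutation(n):          # permutable access order
            w = phi.mean(axis=0) - g.sum(axis=0) / (s * n)
            phi[j] = w                        # refresh one table entry
            g[j] = grad_fns[j](w)             # and its stored gradient
    return phi.mean(axis=0) - g.sum(axis=0) / (s * n)
```

Unlike black-box batch gradient descent, each step touches only one term's stored gradient, which is what lets incremental methods in this class beat the batch rate when the sum has sufficiently many terms.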