Advances and Open Problems in Federated Learning
Federated learning (FL) is a machine learning setting where many clients
(e.g. mobile devices or whole organizations) collaboratively train a model
under the orchestration of a central server (e.g. service provider), while
keeping the training data decentralized. FL embodies the principles of focused
data collection and minimization, and can mitigate many of the systemic privacy
risks and costs resulting from traditional, centralized machine learning and
data science approaches. Motivated by the explosive growth in FL research, this
paper discusses recent advances and presents an extensive collection of open
problems and challenges.
Comment: Published in Foundations and Trends in Machine Learning, Vol. 4, Issue 1. See: https://www.nowpublishers.com/article/Details/MAL-08
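The orchestrated training loop the abstract describes — a server broadcasts a model, clients update it on local data that never leaves the device, and the server aggregates the updates — is commonly instantiated as federated averaging (FedAvg). A minimal sketch, assuming a linear least-squares model and gradient-descent client updates (all function names and parameters here are illustrative, not from the paper):

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=1):
    """One client's local training step on a linear model with squared loss."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_averaging(global_w, clients, rounds=5):
    """Server loop: broadcast the global model, let each client train locally,
    then average the returned weights, weighted by client dataset size.
    `clients` is a list of (X, y) pairs that stay on their respective devices."""
    n_total = sum(len(y) for _, y in clients)
    for _ in range(rounds):
        updates = [local_update(global_w, X, y) for X, y in clients]
        global_w = sum(len(y) / n_total * w
                       for w, (_, y) in zip(updates, clients))
    return global_w
```

Only model weights cross the network in this sketch; the raw `(X, y)` data stays client-side, which is the data-minimization property the abstract highlights.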
Spiking Neural Networks -- Part III: Neuromorphic Communications
Synergies between wireless communications and artificial intelligence are
increasingly motivating research at the intersection of the two fields. On the
one hand, the presence of more and more wirelessly connected devices, each with
its own data, is driving efforts to export advances in machine learning (ML)
from high performance computing facilities, where information is stored and
processed in a single location, to distributed, privacy-minded, processing at
the end user. On the other hand, ML can address algorithm and model deficits in
the optimization of communication protocols. However, implementing ML models
for learning and inference on battery-powered devices that are connected via
bandwidth-constrained channels remains challenging. This paper explores two
ways in which Spiking Neural Networks (SNNs) can help address these open
problems. First, we discuss federated learning for the distributed training of
SNNs, and then describe the integration of neuromorphic sensing, SNNs, and
impulse radio technologies for low-power remote inference.
Comment: Submitted
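What makes SNNs attractive for the battery-powered, bandwidth-constrained setting the abstract describes is that they communicate via sparse binary spikes rather than dense activations. A minimal sketch of a discrete-time leaky integrate-and-fire (LIF) neuron, the standard SNN building block (threshold and leak values are illustrative choices, not from the paper):

```python
def lif_neuron(input_current, threshold=1.0, leak=0.9, v_reset=0.0):
    """Simulate one leaky integrate-and-fire neuron over discrete time steps.

    input_current: sequence of input values, one per time step.
    Returns the binary spike train (1 = spike emitted at that step).
    """
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i        # membrane potential decays, then integrates input
        if v >= threshold:
            spikes.append(1)    # fire a spike and reset the potential
            v = v_reset
        else:
            spikes.append(0)
    return spikes
```

Because the output is a sparse 0/1 train, it maps naturally onto the impulse-radio transmission the abstract mentions: only spike events need to be sent over the channel.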