68 research outputs found

    Nested Distributed Gradient Methods with Adaptive Quantized Communication

    In this paper, we consider minimizing a sum of local convex objective functions in a distributed setting, where communication can be costly. We propose and analyze a class of nested distributed gradient methods with adaptive quantized communication (NEAR-DGD+Q). We show the effect of performing multiple quantized communication steps on the rate of convergence and on the size of the neighborhood of convergence, and prove R-Linear convergence to the exact solution with increasing number of consensus steps and adaptive quantization. We test the performance of the method, as well as some practical variants, on quadratic functions, and show the effects of multiple quantized communication steps in terms of iterations/gradient evaluations, communication, and cost.

    Comment: 9 pages, 2 figures. arXiv admin note: text overlap with arXiv:1709.0299
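
A minimal sketch (not the authors' code) of a NEAR-DGD+Q-style iteration on quadratic local objectives, alternating a local gradient step with several quantized consensus rounds. The network, step size, quantizer, and the schedules for the number of consensus steps and the quantization resolution are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim = 5, 3

# Local quadratics f_i(x) = 0.5 * x^T A_i x - b_i^T x, and a doubly stochastic mixing matrix W.
A = [np.diag(rng.uniform(1.0, 2.0, dim)) for _ in range(n_agents)]
b = [rng.standard_normal(dim) for _ in range(n_agents)]
W = np.full((n_agents, n_agents), 1.0 / n_agents)   # complete graph, uniform weights (assumed)

def grad(i, x):
    return A[i] @ x - b[i]

def quantize(v, delta):
    """Uniform quantizer with resolution delta; 'adaptive' here means delta shrinks over iterations."""
    return delta * np.round(v / delta)

alpha = 0.1                                   # gradient step size (assumed)
x = np.zeros((n_agents, dim))                 # local iterates, one row per agent
for k in range(1, 50):
    # Local gradient step on each agent's own objective.
    y = np.array([x[i] - alpha * grad(i, x[i]) for i in range(n_agents)])
    # Nested consensus: t_k quantized communication rounds with finer quantization over time.
    delta_k = 0.5 ** k                        # adaptive quantization resolution (assumed schedule)
    for _ in range(k):                        # t_k = k consensus steps at iteration k (assumed schedule)
        y = W @ np.array([quantize(y[i], delta_k) for i in range(n_agents)])
    x = y

# All agents should approach the minimizer of the aggregate objective sum_i f_i.
x_star = np.linalg.solve(sum(A), sum(b))
print(np.max(np.linalg.norm(x - x_star, axis=1)))
```

The trade-off described in the abstract shows up directly in the two schedules: more consensus rounds and finer quantization shrink the neighborhood of convergence, at the price of more communicated bits per iteration.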

    Fuzzy-based Augmentation of Federated Averaging for Enhanced Decentralized Machine Learning

    Federated Averaging (FedAvg) is a leading decentralized machine learning approach, prioritizing data privacy. However, it faces challenges such as non-identically distributed data, communication bottlenecks, and adversarial attacks. This work introduces a fuzzy-based FedAvg that leverages fuzzy logic to manage uncertainty in decentralized environments. Fuzzy clustering adapts the model to varied data distributions, addressing non-IID challenges. Fuzzy membership functions enhance aggregation by introducing an adaptive weighting scheme, improving convergence and accuracy. The fuzzy approach incorporates privacy-preserving mechanisms, ensuring secure aggregation with homomorphic encryption and differential privacy. Simulations show improved convergence, resilience to non-IID data, and enhanced privacy compared to traditional FedAvg, contributing to more secure decentralized ML systems.
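
A minimal sketch (not the paper's implementation) of a fuzzy-weighted FedAvg-style aggregation step. The Gaussian membership function, its width, and the way client sample counts enter the weights are illustrative assumptions; the privacy mechanisms (homomorphic encryption, differential privacy) mentioned in the abstract are omitted.

```python
import numpy as np

def fuzzy_fedavg_aggregate(client_updates, client_sizes, width=1.0):
    """Aggregate client model updates with fuzzy membership weights.

    Clients whose updates lie far from the crowd (e.g. due to non-IID data or
    adversarial behavior) receive lower membership and hence lower weight.
    """
    updates = np.asarray(client_updates, dtype=float)    # shape: (n_clients, dim)
    sizes = np.asarray(client_sizes, dtype=float)

    center = np.average(updates, axis=0, weights=sizes)  # plain FedAvg estimate
    dist = np.linalg.norm(updates - center, axis=1)
    membership = np.exp(-(dist / width) ** 2)            # Gaussian fuzzy membership in (0, 1]

    weights = membership * sizes
    weights /= weights.sum()
    return weights @ updates                             # fuzzy-weighted aggregate

# Example: three clients, one outlier; the outlier is down-weighted relative to plain FedAvg.
updates = [[1.0, 1.0], [1.1, 0.9], [5.0, -4.0]]
sizes = [100, 120, 80]
print(fuzzy_fedavg_aggregate(updates, sizes))
```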