
    FedHybrid: A Hybrid Primal-Dual Algorithm Framework for Federated Optimization

    We consider a multi-agent consensus optimization problem over a server-client (federated) network, where all clients are connected to a central server. Current distributed algorithms fail to capture the heterogeneity in clients' local computation capacities. Motivated by the generalized Method of Multipliers in centralized optimization, we derive an approximate Newton-type primal-dual method with a practical distributed implementation by utilizing the server-client topology. We then propose a new primal-dual algorithm framework, FedHybrid, that allows different clients to perform various types of updates. Specifically, each client can choose to perform either gradient-type or Newton-type updates. We propose a novel analysis framework for primal-dual methods and obtain a linear convergence rate for FedHybrid on strongly convex functions, regardless of clients' choices of gradient-type or Newton-type updates. Numerical studies are provided to demonstrate the efficacy of our method in practice. To the best of our knowledge, this is the first hybrid algorithmic framework allowing heterogeneous local updates for distributed consensus optimization with provable convergence and rate guarantees.
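
    To make the hybrid-update idea concrete, here is a minimal sketch of one synchronous round of a hybrid primal-dual consensus scheme in the spirit of the abstract: each client applies either a gradient-type or an approximate Newton-type step to its local objective, and dual variables track the consensus violation. This is an illustration only; the step sizes, dual update, and helper names (grad_f, hess_f, use_newton, alpha, beta) are assumptions, not the paper's exact FedHybrid recursion.

```python
import numpy as np

def hybrid_consensus_step(x, lam, grad_f, hess_f, use_newton, alpha=0.1, beta=0.1):
    """One illustrative synchronous round over n clients.

    x          : (n, d) array of local primal iterates
    lam        : (n, d) array of local dual iterates (consensus multipliers)
    grad_f     : list of callables, grad_f[i](x_i) -> (d,) local gradient
    hess_f     : list of callables, hess_f[i](x_i) -> (d, d) local Hessian
    use_newton : list of bools, client i's choice of update type
    """
    n, d = x.shape
    x_bar = x.mean(axis=0)                      # server-side average
    x_new = np.empty_like(x)
    for i in range(n):
        # local direction: own gradient + dual term + pull toward the average
        g = grad_f[i](x[i]) + lam[i] + beta * (x[i] - x_bar)
        if use_newton[i]:
            # Newton-type client: precondition by a regularized local Hessian
            H = hess_f[i](x[i]) + beta * np.eye(d)
            x_new[i] = x[i] - np.linalg.solve(H, g)
        else:
            # gradient-type client: plain first-order step
            x_new[i] = x[i] - alpha * g
    # dual ascent on the remaining consensus violation
    lam_new = lam + beta * (x_new - x_new.mean(axis=0))
    return x_new, lam_new
```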

    DISH: A Distributed Hybrid Optimization Method Leveraging System Heterogeneity

    We study distributed optimization problems over multi-agent networks, including consensus and network flow problems. Existing distributed methods neglect the heterogeneity among agents' computational capabilities, limiting their effectiveness. To address this, we propose DISH, a distributed hybrid method that leverages system heterogeneity. DISH allows agents with higher computational capabilities or lower computational costs to perform local Newton-type updates, while others adopt simpler gradient-type updates. Notably, DISH covers existing methods such as EXTRA, DIGing, and ESOM-0 as special cases. To analyze DISH's performance with general update directions, we formulate distributed problems as minimax problems and introduce GRAND (gradient-related ascent and descent) and its alternating version, Alt-GRAND, for solving these problems. GRAND generalizes DISH to centralized minimax settings, accommodating various descent-ascent update directions, including gradient-type, Newton-type, scaled-gradient, and other general directions that form acute angles with the partial gradients. Theoretical analysis establishes global sublinear and linear convergence rates for GRAND and Alt-GRAND in strongly-convex-nonconcave and strongly-convex-PL settings, yielding linear rates for DISH. In addition, we derive the local superlinear convergence of Newton-based variations of GRAND in centralized settings. Numerical experiments validate the effectiveness of our methods.
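
    The GRAND setting described above (general descent-ascent directions forming acute angles with the partial gradients) can be illustrated with a small sketch: a descent-ascent loop on a smooth minimax problem where each side may precondition its partial gradient. The step size, preconditioners, and the toy bilinear-coupled objective below are assumptions for illustration, not the paper's algorithm or analysis.

```python
import numpy as np

def descent_ascent(grad_x, grad_y, precond_x, precond_y, x0, y0, eta=0.05, iters=500):
    """Sketch of a gradient-related descent-ascent loop on min_x max_y L(x, y)."""
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        gx, gy = grad_x(x, y), grad_y(x, y)
        dx = precond_x(x, y) @ gx          # descent direction for x
        dy = precond_y(x, y) @ gy          # ascent direction for y
        # "gradient-related" sanity check: nonnegative inner product with the gradient
        assert dx @ gx >= 0 and dy @ gy >= 0
        x = x - eta * dx
        y = y + eta * dy
    return x, y

# Toy example: L(x, y) = 0.5*||x||^2 + x^T A y - 0.5*||y||^2 (strongly convex-concave)
A = np.array([[1.0, 0.5], [0.0, 1.0]])
x_star, y_star = descent_ascent(
    grad_x=lambda x, y: x + A @ y,
    grad_y=lambda x, y: A.T @ x - y,
    precond_x=lambda x, y: np.eye(2),        # gradient-type direction for x
    precond_y=lambda x, y: 0.5 * np.eye(2),  # scaled-gradient direction for y
    x0=np.ones(2), y0=np.ones(2))
print(x_star, y_star)                        # both approach the saddle point at 0
```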

    DISH: A Distributed Hybrid Primal-Dual Optimization Framework to Utilize System Heterogeneity

    We consider solving distributed consensus optimization problems over multi-agent networks. Current distributed methods fail to capture the heterogeneity among agents' local computation capacities. We propose DISH, a distributed hybrid primal-dual algorithmic framework that handles and utilizes system heterogeneity. Specifically, DISH allows agents with higher computational capabilities or lower computational costs to implement Newton-type updates locally, while other agents can adopt the much simpler gradient-type updates. We show that DISH is a general framework that includes EXTRA, DIGing, and ESOM-0 as special cases. Moreover, when all agents take both primal and dual Newton-type updates, DISH approximates Newton's method by estimating both primal and dual Hessians. Theoretically, we show that DISH achieves a linear (Q-linear) convergence rate to the exact optimal solution for strongly convex functions, regardless of agents' choices of gradient-type or Newton-type updates. Finally, we perform numerical studies to demonstrate the efficacy of DISH in practice. To the best of our knowledge, DISH is the first hybrid method allowing heterogeneous local updates for distributed consensus optimization under general network topology with provable convergence and rate guarantees.
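
    For readers less familiar with the primal-dual viewpoint, a minimal sketch of the saddle-point reformulation that frameworks of this type operate on is given below. The notation is generic: $V$ stands for whatever matrix encodes the consensus constraint; the paper's specific choice may differ.

```latex
% Consensus optimization over n agents and its saddle-point (Lagrangian) reformulation.
\begin{align*}
  \min_{x_1,\dots,x_n}\; &\sum_{i=1}^{n} f_i(x_i)
     \quad \text{s.t.} \quad x_1 = x_2 = \cdots = x_n \\
  \Longleftrightarrow\;\; \min_{x}\,\max_{\lambda}\; &\mathcal{L}(x,\lambda)
     = \sum_{i=1}^{n} f_i(x_i) + \lambda^{\top} V x,
     \qquad \text{where } Vx = 0 \iff x_1 = \cdots = x_n .
\end{align*}
```

    In this picture, gradient-type agents take first-order steps on $\mathcal{L}$, while Newton-type agents precondition their steps with local curvature estimates of $f_i$; the abstract's claim is that any mix of the two still converges Q-linearly for strongly convex $f_i$.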

    New isoforms and assembly of glutamine synthetase in the leaf of wheat (Triticum aestivum L.).

    Glutamine synthetase (GS; EC 6.3.1.2) plays a crucial role in the assimilation and re-assimilation of ammonia derived from a wide variety of metabolic processes during plant growth and development. Here, three developmentally regulated isoforms of the GS holoenzyme in the leaf of wheat (Triticum aestivum L.) seedlings are described using native-PAGE with a transferase activity assay. The isoforms showed different mobilities in gels, with GSII > GSIII > GSI. The cytosolic GSI was composed of three subunits, GS1, GSr1, and GSr2, with the same molecular weight (39.2 kDa) but different pI values. GSI appeared at leaf emergence and was active throughout the leaf lifespan. GSII and GSIII, both located in the chloroplast, were each composed of a single 42.1 kDa subunit with different pI values. GSII was active mainly in green leaves, while GSIII showed brief but higher activity in green leaves grown under field conditions. LC-MS/MS experiments revealed that GSII and GSIII have the same amino acid sequence, but GSII has more modification sites. With a modified blue native electrophoresis (BNE) technique and in-gel catalytic activity analysis, only two GS isoforms were observed: one cytosolic and one chloroplastic. Mass calibrations on BNE gels showed that the cytosolic GS1 holoenzyme was ~490 kDa and likely a dodecamer, and the chloroplastic GS2 holoenzyme was ~240 kDa and likely a hexamer. Our experimental data suggest that the activity of GS isoforms in wheat is regulated by subcellular localization, assembly, and modification to fulfil their roles during plant development.

    Electromagnetic Scattering by Open-Ended Cavities: An Analysis Using Precorrected-FFT Approach

    In this paper, the precorrected-FFT method is used to solve the electromagnetic scattering from two-dimensional cavities of arbitrary shape. The integral equation is discretized by the method of moments, and the resultant matrix equation is solved iteratively by the generalized conjugate residual method. Instead of directly computing the matrix-vector multiplication, which requires O(N²) operations, this approach reduces the computational complexity to O(N log N) and avoids the storage of large matrices. At the same time, a technique known as complexifying the wavenumber k is applied to accelerate the convergence of the iterative method in solving this resonance problem. Several examples are considered, and excellent agreement is observed between the radar cross sections computed using the present method and those obtained from the direct solution, demonstrating the feasibility and efficiency of the present method.
    Singapore-MIT Alliance (SMA)
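
    The complexity claim above rests on the fact that, after projection onto a regular grid, the far-field part of the matrix-vector product becomes a convolution with a translation-invariant kernel, which an FFT evaluates in O(N log N). The sketch below illustrates only that convolution step for a toy 1-D kernel; the grid projection, interpolation, and precorrection of nearby interactions that give the precorrected-FFT method its name are omitted, and the kernel and sizes are assumptions for illustration.

```python
import numpy as np

def dense_matvec(kernel_1d, q):
    """Direct O(n^2) product with the Toeplitz matrix A[i, j] = kernel_1d(|i - j|)."""
    n = len(q)
    A = np.array([[kernel_1d(abs(i - j)) for j in range(n)] for i in range(n)])
    return A @ q

def fft_matvec(kernel_1d, q):
    """Same product in O(n log n) via circulant embedding and the FFT."""
    n = len(q)
    col = np.array([kernel_1d(k) for k in range(n)])          # first Toeplitz column
    c = np.concatenate([col, [0.0], col[:0:-1]])              # circulant of size 2n
    out = np.fft.ifft(np.fft.fft(c) * np.fft.fft(q, 2 * n)).real
    return out[:n]

kernel = lambda r: 0.0 if r == 0 else 1.0 / r                 # toy 1/r-type interaction
q = np.random.default_rng(0).standard_normal(256)
assert np.allclose(dense_matvec(kernel, q), fft_matvec(kernel, q))
```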

    Exact Community Recovery in the Geometric SBM

    We study the problem of exact community recovery in the Geometric Stochastic Block Model (GSBM), where each vertex has an unknown community label as well as a known position, generated according to a Poisson point process in $\mathbb{R}^d$. Edges are formed independently conditioned on the community labels and positions, and vertices may only be connected by an edge if they are within a prescribed distance of each other. The GSBM thus favors the formation of dense local subgraphs, which commonly occur in real-world networks, a property that makes the GSBM qualitatively very different from the standard Stochastic Block Model (SBM). We propose a linear-time algorithm for exact community recovery, which succeeds down to the information-theoretic threshold, confirming a conjecture of Abbe, Baccelli, and Sankararaman. The algorithm involves two phases. The first phase exploits the density of local subgraphs to propagate estimated community labels among sufficiently occupied subregions and produces an almost-exact vertex labeling. The second phase then refines the initial labels using a Poisson testing procedure. Thus, the GSBM enjoys local-to-global amplification just as the SBM does, with the advantage of admitting an information-theoretically optimal, linear-time algorithm.
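
    As a rough illustration of the second (refinement) phase described above, the sketch below relabels each vertex by comparing the log-likelihood of its observed edges into the two estimated communities among its graph neighbors. It is a simplification: the intensities a and b are assumed known, only two communities are handled, and the non-edge terms of the full Poisson test used in the paper are omitted.

```python
import numpy as np

def refine_labels(graph_nbrs, labels_hat, a, b):
    """graph_nbrs[v] : indices of vertices adjacent to v (within the prescribed distance)
    labels_hat      : numpy integer array of almost-exact labels in {0, 1}
    a, b            : within- and across-community edge probabilities, 1 > a > b > 0
    """
    refined = labels_hat.copy()
    log_in, log_out = np.log(a), np.log(b)
    for v, nbrs in enumerate(graph_nbrs):
        # edge counts from v into each estimated community
        n_to_0 = int(np.sum(labels_hat[nbrs] == 0))
        n_to_1 = len(nbrs) - n_to_0
        # edge-only log-likelihoods of the two hypotheses for v's label
        ll_0 = n_to_0 * log_in + n_to_1 * log_out
        ll_1 = n_to_0 * log_out + n_to_1 * log_in
        refined[v] = 0 if ll_0 >= ll_1 else 1
    return refined
```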