30 research outputs found

    Proximal Gradient methods with Adaptive Subspace Sampling

    Many applications in machine learning and signal processing involve nonsmooth optimization problems, and this nonsmoothness often induces a low-dimensional structure in the optimal solutions. In this paper, we propose a randomized proximal gradient method harnessing this underlying structure. We introduce two key components: i) a random subspace proximal gradient algorithm; ii) an identification-based sampling of the subspaces. Their interplay brings a significant performance improvement on typical learning problems in terms of dimensions explored.
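    The abstract gives no pseudocode; the following is a minimal sketch of the subspace idea on a lasso objective, assuming plain proximal gradient steps and uniform coordinate sampling (the paper's identification-based sampling is more refined). All names and parameter choices here are illustrative, not the authors' implementation:

    ```python
    import random

    def soft_threshold(v, t):
        # Proximal operator of t*|.|: shrinks v toward zero by t.
        if v > t:
            return v - t
        if v < -t:
            return v + t
        return 0.0

    def subspace_prox_grad(A, b, lam, step, n_iters, k, seed=0):
        """Random-subspace proximal gradient sketch for the lasso
        min_x 0.5*||Ax - b||^2 + lam*||x||_1: each iteration updates
        only k randomly sampled coordinates instead of all of them."""
        rng = random.Random(seed)
        n = len(A[0])
        x = [0.0] * n
        for _ in range(n_iters):
            # Residual r = A x - b for the current iterate.
            r = [sum(Ai[j] * x[j] for j in range(n)) - bi
                 for Ai, bi in zip(A, b)]
            S = rng.sample(range(n), k)  # random coordinate subspace
            for j in S:
                # Partial gradient along coordinate j, then prox step.
                g = sum(A[i][j] * r[i] for i in range(len(b)))
                x[j] = soft_threshold(x[j] - step * g, step * lam)
        return x
    ```

    On a well-scaled problem the sparse coordinates are driven exactly to zero by the soft-thresholding step, which is the identification property the sampling scheme exploits.
    
    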

    Distributed Learning with Sparse Communications by Identification

    In distributed optimization for large-scale learning, a major performance limitation comes from the communications between the different entities. In the setting where workers perform computations on local data while a coordinator machine aggregates their updates to minimize a global loss, we present an asynchronous optimization algorithm that efficiently reduces these communications. The reduction comes from a random sparsification of the local updates. We show that this algorithm converges linearly in the strongly convex case and also identifies optimal strongly sparse solutions. We further exploit this identification to propose an automatic dimension reduction, adaptively sparsifying all exchanges between coordinator and workers.
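    The core communication-saving primitive described here, random sparsification of a worker's update, can be sketched in a few lines. This is a generic unbiased sparsifier under an assumed keep-probability `p`, not the paper's exact scheme:

    ```python
    import random

    def sparsify(update, p, rng):
        """Randomly sparsify a worker's update vector before sending it:
        keep each coordinate independently with probability p and rescale
        the kept entries by 1/p, so the result is an unbiased estimate of
        the full update while only ~p*len(update) entries are nonzero
        (and hence need to be communicated)."""
        return [u / p if rng.random() < p else 0.0 for u in update]
    ```

    Unbiasedness is what lets the coordinator average such sparsified messages without introducing systematic error; averaging many draws recovers the original update.
    
    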

    Privacy Preserving Randomized Gossip Algorithms

    In this work we present three different randomized gossip algorithms for solving the average consensus problem while protecting the information about the initial private values stored at the nodes. We give iteration complexity bounds for all three methods and perform extensive numerical experiments.
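    For reference, the baseline (non-private) randomized pairwise gossip that these algorithms build on can be sketched as follows; the graph, seed, and round count are illustrative:

    ```python
    import random

    def gossip_average(values, edges, n_rounds, seed=0):
        """Standard randomized pairwise gossip for average consensus:
        at each round one edge (i, j) is activated uniformly at random
        and both endpoints replace their values by the pair's average.
        The network-wide mean is preserved at every step, and all values
        converge to that mean."""
        rng = random.Random(seed)
        x = list(values)
        for _ in range(n_rounds):
            i, j = rng.choice(edges)
            m = (x[i] + x[j]) / 2.0
            x[i] = x[j] = m
        return x
    ```

    The privacy issue the paper addresses is visible here: each exchange reveals a node's current value to its neighbor, so the initial private values leak through early rounds unless the protocol is modified.
    
    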

    Chronicles of nature calendar, a long-term and large-scale multitaxon database on phenology

    We present an extensive, large-scale, long-term and multitaxon database on phenological and climatic variation, comprising 506,186 observation dates acquired in 471 localities in the Russian Federation, Ukraine, Uzbekistan, Belarus and Kyrgyzstan. The data cover the period 1890-2018, with 96% of the data from 1960 onwards. The database is richest in plants, birds and climatic events, but also includes insects, amphibians, reptiles and fungi. It includes multiple events per species, such as the onset days of leaf unfolding and leaf fall for plants, and the days of first spring and last autumn occurrence for birds. The data were acquired using standardized methods by permanent staff of national parks and nature reserves (87% of the data) and members of a phenological observation network (13% of the data). The database is valuable for exploring how species respond in their phenology to climate change. Large-scale analyses of spatial variation in phenological response can help to better predict the consequences of species and community responses to climate change.

    Phenological shifts of abiotic events, producers and consumers across a continent

    Ongoing climate change can shift organism phenology in ways that vary with the species, habitats and climate factors studied. To probe for large-scale patterns in the associated phenological change, we use 70,709 observations from six decades of systematic monitoring across the former Union of Soviet Socialist Republics. Among 110 phenological events related to plants, birds, insects, amphibians and fungi, we find a mosaic of change, defying simple predictions of earlier springs, later autumns and stronger changes at higher latitudes and elevations. Site mean temperature emerged as a strong predictor of local phenology, but the magnitude and direction of change varied with trophic level and the relative timing of an event. Beyond temperature-associated variation, we uncover high variation among both sites and years, with some sites characterized by disproportionately long seasons and others by short ones. Our findings emphasize concerns regarding ecosystem integrity and highlight the difficulty of predicting climate change outcomes.

    Proximal optimization with automatic dimension reduction for large-scale learning

    In this thesis, we develop proximal algorithms with automatic dimension reduction for composite optimization problems with sparsity-inducing regularizers. Based on the identification property of proximal methods, we first develop a "sketch-and-project" method that uses projections adapted to the structure of the current iterate. When the final solution is sparse, this method allows us to work in random low-dimensional subspaces instead of the full, possibly very high-dimensional, space. Second, we place ourselves in the context of delay-tolerant asynchronous proximal methods and use our dimension-reduction technique to decrease the total size of communications between machines. However, this direct approach is proven to converge only for well-conditioned problems, both in theory and in practice. To handle general problems, we therefore wrap it in a proximal reconditioning framework, which yields an algorithm with theoretical guarantees of convergence that provably costs less in communications than its non-sparsified counterpart; numerical experiments show faster runtime convergence on classical, highly sparse problems.
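    The projections "adapted to the structure of the current iterate" can be illustrated for the sparse case in a couple of helper functions. These are hypothetical names sketching the identification-then-restrict idea, not the thesis's actual interface:

    ```python
    def support(x, tol=0.0):
        """Coordinates identified as nonzero by the proximal step:
        for sparsity-inducing regularizers, the prox sets the other
        coordinates exactly to zero."""
        return [j for j, v in enumerate(x) if abs(v) > tol]

    def project_update(update, S):
        """Project an update onto the identified support S: entries
        outside S are zeroed, so only |S| coordinates need to be stored
        or, in the distributed setting, communicated."""
        keep = set(S)
        return [u if j in keep else 0.0 for j, u in enumerate(update)]
    ```

    Once the iterates' support has stabilized (identification), such projections discard almost nothing, which is why the dimension reduction becomes nearly free near the solution.
    
    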