97,307 research outputs found

    Making It to the Major Leagues: Career Movement between Library and Archival Professions and from Small College to Large University Libraries

    published or submitted for publication

    Grid infrastructures supporting paediatric endocrinology across Europe

    Paediatric endocrinology is a highly specialised area of clinical medicine, with many experts holding specific knowledge distributed over a wide geographical area. The European Society for Paediatric Endocrinology (ESPE) is an example of such a body of experts that requires regular collaboration and sharing of data and knowledge. This paper describes work, developed as a corollary to the VOTES project [1] and implementing similar architectures, to provide a data grid that allows information to be distributed efficiently between collaborating partners and allows wide-scale analyses to be run over the entire data set, which necessarily involves crossing domain boundaries and negotiating data access between administrations that trust each other only to a limited degree.
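    The scatter-gather pattern the abstract describes (query every partner site, but let each administrative domain release only what it permits) can be sketched as follows. This is a minimal illustration only: the site names, record fields, and `Site`/`federated_query` interfaces are invented for this sketch, not taken from the VOTES/ESPE grid middleware.

```python
class Site:
    """One collaborating partner's data node with its own access policy."""

    def __init__(self, name, records, shared_fields):
        self.name = name                    # hypothetical site identifier
        self.records = records              # local records (list of dicts)
        self.shared_fields = shared_fields  # fields this domain agrees to share

    def query(self, fields):
        # Release only the intersection of requested and permitted fields,
        # modelling the limited trust between administrative domains.
        allowed = [f for f in fields if f in self.shared_fields]
        return [{f: r[f] for f in allowed} for r in self.records]


def federated_query(sites, fields):
    """Scatter a query to every site and merge the partial results."""
    merged = []
    for site in sites:
        merged.extend(site.query(fields))
    return merged
```

    A wide-scale analysis then runs over the merged rows, with each site's withheld fields simply absent rather than centrally overridden.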

    Neural Architecture Search using Deep Neural Networks and Monte Carlo Tree Search

    Neural Architecture Search (NAS) has shown great success in automating the design of neural networks, but the prohibitive amount of computation behind current NAS methods requires further work on improving sample efficiency and network evaluation cost to get better results in a shorter time. In this paper, we present a novel scalable Monte Carlo Tree Search (MCTS) based NAS agent, named AlphaX, to tackle these two aspects. AlphaX improves search efficiency by adaptively balancing exploration and exploitation at the state level, and by using a Meta-Deep Neural Network (DNN) to predict network accuracies and bias the search toward promising regions. To amortize the network evaluation cost, AlphaX accelerates MCTS rollouts with a distributed design and reduces the number of epochs needed to evaluate a network through transfer learning, guided by the tree structure in MCTS. In 12 GPU days and 1000 samples, AlphaX found an architecture that reaches 97.84% top-1 accuracy on CIFAR-10 and 75.5% top-1 accuracy on ImageNet, exceeding SOTA NAS methods in both accuracy and sample efficiency. We also evaluate AlphaX on NASBench-101, a large-scale NAS dataset; AlphaX is 3x and 2.8x more sample efficient than Random Search and Regularized Evolution, respectively, in finding the global optimum. Finally, we show that the searched architecture improves a variety of vision applications, from Neural Style Transfer to Image Captioning and Object Detection. Comment: to appear in the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-2020).
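    The core loop of an MCTS-based architecture search (select by an upper-confidence rule, expand, simulate, backpropagate) can be sketched on a toy search space. Everything here is a stand-in: the operation list, depth, proxy `evaluate` reward, and exploration constant are invented for illustration; AlphaX itself trains or meta-predicts real network accuracies and distributes rollouts, none of which is reproduced.

```python
import math
import random

# Toy search space: pick one op per layer (hypothetical, for illustration).
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]
DEPTH = 3


def evaluate(arch):
    # Proxy "accuracy" that rewards conv3x3 layers; a real NAS agent would
    # train the network or query an accuracy predictor here.
    return sum(op == "conv3x3" for op in arch) / DEPTH


class Node:
    def __init__(self, arch):
        self.arch = arch        # partial architecture (tuple of ops)
        self.children = {}      # op -> Node
        self.visits = 0
        self.value = 0.0        # sum of rollout rewards


def ucb(parent, child, c=0.7):
    """UCB1 score: exploitation term plus exploration bonus."""
    if child.visits == 0:
        return float("inf")
    return (child.value / child.visits
            + c * math.sqrt(math.log(parent.visits) / child.visits))


def rollout(arch):
    # Complete the partial architecture with random ops, then score it.
    arch = list(arch)
    while len(arch) < DEPTH:
        arch.append(random.choice(OPS))
    return evaluate(arch)


def mcts(iterations=500, seed=0):
    random.seed(seed)
    root = Node(())
    for _ in range(iterations):
        # Selection: descend by UCB while all ops of a node are expanded.
        node, path = root, [root]
        while len(node.arch) < DEPTH and len(node.children) == len(OPS):
            node = max(node.children.values(), key=lambda ch: ucb(node, ch))
            path.append(node)
        # Expansion: add one untried child unless already at full depth.
        if len(node.arch) < DEPTH:
            op = random.choice([o for o in OPS if o not in node.children])
            child = Node(node.arch + (op,))
            node.children[op] = child
            path.append(child)
            node = child
        # Simulation and backpropagation.
        reward = rollout(node.arch)
        for n in path:
            n.visits += 1
            n.value += reward
    # Extract the best architecture by visit count.
    arch, node = [], root
    while node.children:
        op, node = max(node.children.items(), key=lambda kv: kv[1].visits)
        arch.append(op)
    return tuple(arch)
```

    The state-level exploration/exploitation balance lives entirely in `ucb`: a rarely visited child keeps a large bonus even if its current average reward is mediocre, which is what lets the search escape early misestimates.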

    A General Analysis of the Convergence of ADMM

    We provide a new proof of the linear convergence of the alternating direction method of multipliers (ADMM) when one of the objective terms is strongly convex. Our proof is based on a framework for analyzing optimization algorithms introduced in Lessard et al. (2014), which reduces algorithm convergence to verifying the stability of a dynamical system. This approach generalizes a number of existing results and obviates any assumptions about specific choices of algorithm parameters. On a numerical example, we demonstrate that minimizing the derived bound on the convergence rate provides a practical approach to selecting algorithm parameters for particular ADMM instances. We complement our upper bound by constructing a nearly matching lower bound on the worst-case rate of convergence. Comment: 10 pages, 6 figures.
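    For context on the iteration being analyzed, here is a minimal pure-Python sketch of ADMM on a scalar lasso problem, minimising 0.5*Σ(aᵢx − bᵢ)² + λ|x| via the splitting x = z. The problem instance and the choice ρ = 1.0 are invented for illustration; they are not the paper's numerical example, and the paper's point is precisely that such parameters can instead be chosen by minimising a derived rate bound.

```python
def soft_threshold(v, k):
    """Proximal operator of k*|.|: shrink v toward zero by k."""
    if v > k:
        return v - k
    if v < -k:
        return v + k
    return 0.0


def admm_lasso(a, b, lam, rho=1.0, iters=200):
    """ADMM for the scalar lasso: 0.5*sum((a_i*x - b_i)^2) + lam*|x|."""
    ata = sum(ai * ai for ai in a)
    atb = sum(ai * bi for ai, bi in zip(a, b))
    x = z = u = 0.0  # primal x, split variable z, scaled dual u
    for _ in range(iters):
        x = (atb + rho * (z - u)) / (ata + rho)  # quadratic subproblem
        z = soft_threshold(x + u, lam / rho)     # l1 subproblem
        u += x - z                               # dual update
    return z
```

    The strongly convex quadratic term is what makes the iteration converge linearly here, matching the setting the abstract analyzes; each pass is a fixed affine-plus-shrinkage map, i.e. exactly the kind of dynamical system the Lessard et al. framework checks for stability.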