
    Observation Centric Sensor Data Model

    Management of sensor data requires metadata to understand the semantics of observations. While e-science researchers place high demands on metadata, they are selective about entering it. This paper argues for focusing on the essentials, i.e., the actual observations, described by location, time, owner, instrument, and measurement. The applicability of this approach is demonstrated in two very different case studies.
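    As a concrete illustration of the observation-centric idea, the sketch below models a single observation as a record carrying exactly the five essential fields named in the abstract. The field types and example values are hypothetical illustrations, not taken from the paper.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Observation:
    """One sensor observation with the five essential metadata fields."""
    location: tuple[float, float]   # (latitude, longitude) in degrees
    time: datetime                  # when the measurement was taken
    owner: str                      # responsible person or institution
    instrument: str                 # identifier of the measuring device
    measurement: float              # the observed value

# Hypothetical example record
obs = Observation(
    location=(48.137, 11.575),
    time=datetime(2024, 5, 1, 12, 0),
    owner="hydrology-lab",
    instrument="thermometer-07",
    measurement=17.3,
)
```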

    DeepOBS: A Deep Learning Optimizer Benchmark Suite

    Because the choice and tuning of the optimizer affect both the speed and the final performance of deep learning, there is significant past and recent research in this area. Yet, perhaps surprisingly, there is no generally agreed-upon protocol for the quantitative and reproducible evaluation of optimization strategies for deep learning. We suggest routines and benchmarks for stochastic optimization, with a special focus on the unique aspects of deep learning, such as stochasticity, tunability, and generalization. As the primary contribution, we present DeepOBS, a Python package of deep learning optimization benchmarks. The package addresses key challenges in the quantitative assessment of stochastic optimizers and automates most steps of benchmarking. The library includes a wide and extensible set of ready-to-use, realistic optimization problems, such as training Residual Networks for image classification on ImageNet or character-level language prediction models, as well as popular classics like MNIST and CIFAR-10. The package also provides realistic baseline results for the most popular optimizers on these test problems, so that new optimizers can be compared fairly without having to re-run costly experiments. It comes with output back-ends that directly produce LaTeX code for inclusion in academic publications. It supports TensorFlow and is available open source.
    Comment: Accepted at ICLR 2019. 9 pages, 3 figures, 2 tables
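    To make the benchmarking protocol concrete, here is a minimal, self-contained sketch of the workflow such a suite automates: several optimizers run on the same stochastic test problem across multiple seeds, with comparable summary statistics reported at the end. The test problem, optimizers, and hyperparameters are illustrative assumptions; this does not use the actual DeepOBS API.

```python
import numpy as np

def noisy_quadratic_grad(w, rng):
    """Stochastic gradient of f(w) = 0.5 * w^T diag(h) w plus noise."""
    h = np.array([1.0, 10.0, 100.0])            # ill-conditioned curvature
    return h * w + rng.normal(scale=1.0, size=w.shape)

def sgd(w, g, state, lr=0.01):
    return w - lr * g, state

def momentum(w, g, state, lr=0.01, beta=0.9):
    v = beta * state + g                        # heavy-ball velocity
    return w - lr * v, v

def benchmark(step_fn, n_seeds=5, n_steps=500):
    """Run one optimizer over several seeds; return mean/std of final loss."""
    h = np.array([1.0, 10.0, 100.0])
    finals = []
    for seed in range(n_seeds):
        rng = np.random.default_rng(seed)
        w, state = np.ones(3), np.zeros(3)
        for _ in range(n_steps):
            g = noisy_quadratic_grad(w, rng)
            w, state = step_fn(w, g, state)
        finals.append(0.5 * np.sum(h * w**2))
    return np.mean(finals), np.std(finals)

for name, fn in [("SGD", sgd), ("Momentum", momentum)]:
    mean, std = benchmark(fn)
    print(f"{name}: final loss {mean:.4f} +/- {std:.4f}")
```

    Reporting mean and spread over seeds, rather than a single run, is the key design point: it separates genuine optimizer differences from stochastic training noise.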

    Constrained correlation functions from the Millennium Simulation

    Context. In previous work, we developed a quasi-Gaussian approximation for the likelihood of correlation functions which, in contrast to the usual Gaussian approach, incorporates fundamental mathematical constraints on correlation functions. The analytical computation of these constraints is only feasible for correlation functions of one-dimensional random fields.
    Aims. In this work, we aim to obtain corresponding constraints for higher-dimensional random fields and to test them in a more realistic context.
    Methods. We develop numerical methods to compute the constraints on correlation functions that are also applicable to two- and three-dimensional fields. To test the accuracy of the numerically obtained constraints, we compare them to the analytical results for the one-dimensional case. Finally, we compute correlation functions from the halo catalog of the Millennium Simulation, check whether they obey the constraints, and examine the performance of the transformation used in the construction of the quasi-Gaussian likelihood.
    Results. We find that our numerical methods of computing the constraints are robust and that the correlation functions measured from the Millennium Simulation obey them. Even though the measured correlation functions lie well inside the allowed region of parameter space, i.e., far away from the boundaries of the allowed volume defined by the constraints, we find strong indications that the quasi-Gaussian likelihood yields a substantially more accurate description than the Gaussian one.
    Comment: 11 pages, 13 figures, updated to match version accepted by A&A
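    The kind of constraint at issue can be illustrated with a small Monte Carlo experiment. The sketch below is my own illustration, under the standard assumption that such constraints arise from the non-negativity of the power spectrum: the normalized correlations r_n = xi(n*dx)/xi(0) of a 1D field are non-negatively weighted averages of cosines, so for instance r_2 >= 2*r_1^2 - 1 follows from Jensen's inequality and can be verified by sampling random spectra.

```python
import numpy as np

# Probe the allowed (r_1, r_2) region by drawing random non-negative
# power spectra and checking the analytic 1D bound r_2 >= 2*r_1^2 - 1.
# (Illustration only; not the paper's code or its exact constraint set.)

rng = np.random.default_rng(0)

def sample_r1_r2(n_modes=16):
    """Draw a random non-negative spectrum and return (r_1, r_2)."""
    weights = rng.random(n_modes)
    weights /= weights.sum()                    # normalize so xi(0) = 1
    phases = rng.uniform(0.0, np.pi, n_modes)   # phase k*dx of each mode
    r1 = np.sum(weights * np.cos(phases))       # xi(dx) / xi(0)
    r2 = np.sum(weights * np.cos(2.0 * phases)) # xi(2*dx) / xi(0)
    return r1, r2

samples = np.array([sample_r1_r2() for _ in range(100_000)])
r1, r2 = samples[:, 0], samples[:, 1]
violations = np.sum(r2 < 2.0 * r1**2 - 1.0)
print(f"bound r_2 >= 2 r_1^2 - 1 violated {violations} times of {len(r1)}")
```

    Since cos(2p) = 2*cos(p)^2 - 1 and the weights form a probability distribution, no sample can violate the bound; the same sampling idea extends to higher lags and dimensions, where closed-form bounds are no longer available.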

    Inelastic Confinement-Induced Resonances in Low-Dimensional Quantum Systems

    A theoretical model is presented describing the confinement-induced resonances observed in the recent loss experiment of Haller et al. [Phys. Rev. Lett. 104, 153203 (2010)]. These resonances originate from possible molecule formation due to the coupling of center-of-mass and relative motion. The model is verified by ab initio calculations and predicts the resonance positions both in 1D and in 2D confinement, in agreement with the experiment. This resolves the contradiction between the experimental observations and previous theoretical predictions.
    Comment: 5 pages, 4 figures
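    Schematically (my paraphrase of the mechanism described above, not an equation quoted from the paper), an inelastic resonance is expected where the incoming two-atom state, with the center of mass in its transverse ground state, becomes degenerate with a molecular bound state whose center of mass is transversally excited; the coupling of center-of-mass and relative motion then drives molecule formation:

```latex
% Schematic degeneracy condition (illustrative form, assuming a transverse
% trap frequency \omega_\perp, a relative-motion binding energy E_b > 0,
% and a center-of-mass excitation of n transverse quanta):
\begin{equation}
  E_{\mathrm{rel}}^{\mathrm{scatt}} + E_{\mathrm{CM}}^{(0)}
  = \left(-E_b\right) + E_{\mathrm{CM}}^{(0)} + n\,\hbar\omega_\perp ,
  \qquad \text{i.e.} \qquad
  E_{\mathrm{rel}}^{\mathrm{scatt}} + E_b = n\,\hbar\omega_\perp .
\end{equation}
```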

    Towards Universally Optimal Shortest Paths Algorithms in the Hybrid Model

    A drawback of the classic approach to complexity analysis of distributed graph problems is that it mostly informs about the complexity on notorious classes of "worst case" graphs. Algorithms that are used to prove a tight (existential) bound are essentially optimized to perform well on such worst-case graphs. However, such graphs are often either unlikely or actively avoided in practice, where benign graph instances usually admit much faster solutions. To circumvent these drawbacks, the concept of universal complexity analysis in the distributed setting was suggested by [Kutten and Peleg, PODC'95] and actively pursued by [Haeupler et al., STOC'21]. Here, the aim is to gauge the complexity of a distributed graph problem depending on the given graph instance. The challenge is to identify and understand the graph property that allows one to accurately quantify the complexity of a distributed problem on a given graph. In the present work, we consider distributed shortest paths problems in the HYBRID model of distributed computing, where nodes have simultaneous access to two different modes of communication: one restricted by locality and the other restricted by congestion. We identify the graph parameter of neighborhood quality and show that it accurately describes a universal bound for the complexity of a certain class of shortest paths problems in the HYBRID model.
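    To make the two communication modes concrete, here is a toy single-round message buffer in the spirit of the HYBRID model. It is my own illustration, following the common convention of unbounded communication along graph edges plus a node-capacitated global network allowing O(log n) messages per node per round; the class name and budget choice are assumptions, not definitions from the paper.

```python
import math
from collections import defaultdict

class HybridRound:
    """Message buffer for one round with local and global modes."""
    def __init__(self, adjacency):
        self.adj = adjacency                        # node -> set of neighbors
        self.cap = max(1, int(math.log2(len(adjacency))))  # global budget
        self.sent_global = defaultdict(int)         # sender -> messages used
        self.inbox = defaultdict(list)              # receiver -> [(sender, msg)]

    def send_local(self, u, v, msg):
        """Local mode: only along graph edges; message size unbounded."""
        assert v in self.adj[u], "local messages must follow graph edges"
        self.inbox[v].append((u, msg))

    def send_global(self, u, v, msg):
        """Global mode: any destination, but a per-node, per-round budget."""
        assert self.sent_global[u] < self.cap, "global budget exhausted"
        self.sent_global[u] += 1
        self.inbox[v].append((u, msg))

# Example: on a path graph, node 0 reaches a distant node in one global hop.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
rnd = HybridRound(adj)
rnd.send_local(0, 1, "distance update")
rnd.send_global(0, 3, "distance update")   # bypasses the graph distance
print(rnd.inbox[3])
```

    The tension between the two modes is exactly what universal bounds for shortest paths must capture: local edges are plentiful but slow to cover distance, while the global network covers distance instantly but is severely congestion-limited.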