    Geoscience after IT: Part F. Familiarization with quantitative analysis

    Numbers, measurement and calculation extend our view of the world. Statistical methods describe the properties of sets of quantitative data, and can test models (particularly the model that observed relationships arose by chance) and help us to draw conclusions. Links between spatial and quantitative methods, through coordinate geometry and matrix algebra, lead to graphical representations for visualizing and exploring relationships. Multivariate statistics tie into visualization to look at patterns among many properties.
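
    Illustrative sketch (hypothetical data, not from the paper): the two ideas above, testing whether an observed relationship could plausibly have arisen by chance, and using matrix algebra to project multivariate data into coordinates suitable for a scatter plot, take only a few lines of Python.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical geochemical data: 50 samples, 4 measured properties.
        X = rng.normal(size=(50, 4))
        X[:, 1] += 0.6 * X[:, 0]          # build in one genuine relationship

        # Test the chance-relationship model for one pair of properties.
        r, p = stats.pearsonr(X[:, 0], X[:, 1])
        print(f"correlation r={r:.2f}, p={p:.4f}")  # small p: unlikely by chance

        # Principal components: eigen-decomposition of the covariance matrix
        # yields 2-D coordinates that display all four properties at once.
        C = np.cov(X, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(C)             # eigenvalues ascending
        scores = (X - X.mean(axis=0)) @ eigvecs[:, -2:]  # top two components
        print("coordinates for plotting:", scores[:3])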

    Conglomerate Multi-Fidelity Gaussian Process Modeling, with Application to Heavy-Ion Collisions

    In an era where scientific experimentation is often costly, multi-fidelity emulation provides a powerful tool for predictive scientific computing. While there has been notable work on multi-fidelity modeling, existing models do not incorporate an important "conglomerate" property of multi-fidelity simulators, where the accuracies of different simulator components (modeling separate physics) are controlled by different fidelity parameters. Such conglomerate simulators are widely encountered in complex nuclear physics and astrophysics applications. We thus propose a new CONglomerate multi-FIdelity Gaussian process (CONFIG) model, which embeds this conglomerate structure within a novel non-stationary covariance function. We show that the proposed CONFIG model can capture prior knowledge on the numerical convergence of conglomerate simulators, which allows for cost-efficient emulation of multi-fidelity systems. We demonstrate the improved predictive performance of CONFIG over state-of-the-art models in a suite of numerical experiments and two applications, the first for emulation of cantilever beam deflection and the second for emulating the evolution of the quark-gluon plasma, which was theorized to have filled the Universe shortly after the Big Bang.
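
    Illustrative sketch (a generic multi-fidelity kernel under an assumed convergence rate, not the paper's CONFIG covariance): the core idea, that a fidelity parameter t such as a mesh spacing controls an error term that vanishes as t approaches 0, can be written directly into a Gaussian process covariance.

        import numpy as np

        def rbf(a, b, ls):
            d = a[:, None] - b[None, :]
            return np.exp(-0.5 * (d / ls) ** 2)

        def mf_kernel(x1, t1, x2, t2, l=2.0):
            # Limiting "exact" surface plus a discretization-error term whose
            # variance scales as (t * t')**l, so smaller t means less error.
            return rbf(x1, x2, 0.3) + np.outer(t1**l, t2**l) * rbf(x1, x2, 0.3)

        # Many cheap low-fidelity runs plus a few costly high-fidelity runs.
        x_lo, t_lo = np.linspace(0, 1, 15), np.full(15, 0.5)
        x_hi, t_hi = np.linspace(0, 1, 4), np.full(4, 0.1)
        x, t = np.concatenate([x_lo, x_hi]), np.concatenate([t_lo, t_hi])
        y = np.sin(2 * np.pi * x) + t**2 * np.cos(5 * x)  # toy simulator output

        K = mf_kernel(x, t, x, t) + 1e-6 * np.eye(len(x))
        x_new = np.linspace(0, 1, 5)
        k_star = mf_kernel(x_new, np.zeros(5), x, t)  # predict the t = 0 limit
        print(np.round(k_star @ np.linalg.solve(K, y), 2))  # posterior mean

    Here the prediction at t = 0 borrows strength from the cheap runs while the (t t')**l term discounts their discretization error, which is the cost-efficiency argument in miniature.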

    Tri-level decision-making with multiple followers: Model, algorithm and case study

    Tri-level decision-making arises to address compromises among interacting decision entities distributed throughout a three-level hierarchy; these entities are respectively termed the top-level leader, the middle-level follower and the bottom-level follower. This study considers an uncooperative situation where multiple followers at the same (middle or bottom) level make their individual decisions independently but consider the decision results of their counterparts as references through information exchanged among themselves. This situation is called a reference-based uncooperative multi-follower tri-level (MFTL) decision problem, which appears in many real-world applications. To solve this problem, we need to find an optimal solution achieving both the Stackelberg equilibrium in the three-level vertical structure and the Nash equilibrium among multiple followers at the same horizontal level. In this paper, we first propose a general linear MFTL decision model for this situation. We then develop an MFTL Kth-Best algorithm to find an optimal solution to the model. Since the optimal solution represents a compromise in the uncooperative situation, and it is often imprecise or ambiguous for the decision entities to judge their satisfaction with it, we use a fuzzy programming approach to characterize and evaluate the solution obtained. Lastly, a real-world case study on production-inventory planning illustrates the effectiveness of the proposed MFTL decision techniques.
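
    Illustrative sketch (a toy game, not the paper's MFTL Kth-Best algorithm): the horizontal Nash condition among same-level followers can be approached by best-response iteration, with each follower re-solving its own linear program while treating its counterpart's latest decision as a fixed reference.

        from scipy.optimize import linprog

        x = 3.0          # leader's decision, held fixed in this sketch
        y = [0.0, 0.0]   # the two followers' current decisions

        for _ in range(20):
            for i in (0, 1):
                other = y[1 - i]
                # Follower i maximizes its own payoff (linprog minimizes, so
                # negate) under a shared capacity constraint in which the
                # counterpart's current decision enters as a fixed reference.
                res = linprog(c=[-1.0],
                              A_ub=[[1.0]], b_ub=[8.0 - x - other],
                              bounds=[(0.0, 5.0)])
                y[i] = res.x[0]

        print("follower decisions at equilibrium:", y)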

    Analysis and optimization of material flow inside the system of rotary coolers and intake pipeline via discrete element method modelling

    There is hardly any industry that does not use transport, storage, and processing of particulate solids in its production process. In the past, all device designs were based on empirical relationships or the designer's experience; in the field of particulate solids, however, the discrete element method (DEM) has been increasingly used in recent years. This study shows how this simulation tool can be used in practice, specifically in dealing with operating problems of a rotary cooler that transports and cools the hot fly ash generated by combustion in fluidized-bed boilers. For the given operating conditions, an analysis was carried out of the current cooling design, which consists of a non-standard intake pipeline that divides and supplies the material to two rotary coolers. The study revealed shortcomings in both the pipeline design and the cooler design: the material was unevenly dispensed between the two coolers, which, combined with the limited transport capacity of the coolers, led to overflowing and congestion of the whole system. Therefore, after visualizing the material flow and exporting the necessary data from the DEM model, design measures to mitigate these unwanted phenomena were carried out.
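
    Illustrative sketch (a 1-D toy, not the study's cooler model): the core of any DEM code is an explicit time step driven by a contact-force law; here a linear spring-dashpot acts between neighbouring particles and between particles and a floor.

        import numpy as np

        n, r, m = 5, 0.01, 0.002             # particles, radius [m], mass [kg]
        k, c, g, dt = 5e3, 0.05, 9.81, 1e-5  # stiffness, damping, gravity, step
        z = 0.1 + 0.05 * np.arange(n)        # initial particle heights [m]
        v = np.zeros(n)

        for step in range(100_000):          # 1 s of simulated time
            f = -m * g * np.ones(n)                         # gravity
            f += np.where(z < r, k * (r - z) - c * v, 0.0)  # floor contact
            for i in range(n - 1):                          # pairwise contacts
                gap = z[i + 1] - z[i] - 2 * r
                if gap < 0:                                 # particles overlap
                    fc = -k * gap - c * (v[i + 1] - v[i])
                    f[i] -= fc
                    f[i + 1] += fc
            v += f / m * dt
            z += v * dt

        print("settled particle heights [m]:", np.round(z, 4))

    A production DEM study like the one above adds 3-D geometry, rotation, and wall meshes, but the force-integrate loop is the same.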

    Emergent complex neural dynamics

    A large repertoire of spatiotemporal activity patterns in the brain is the basis for adaptive behaviour. Understanding the mechanism by which the brain's hundred billion neurons and hundred trillion synapses manage to produce such a range of cortical configurations in a flexible manner remains a fundamental problem in neuroscience. One plausible solution is the involvement of universal mechanisms of emergent complex phenomena evident in dynamical systems poised near a critical point of a second-order phase transition. We review recent theoretical and empirical results supporting the notion that the brain is naturally poised near criticality, as well as its implications for a better understanding of the brain.
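
    Illustrative sketch (a standard toy model, not from the review): a branching process of spreading activity makes the critical point concrete. At branching ratio sigma = 1 avalanche sizes become heavy-tailed, approximately power-law, while subcritical sigma < 1 yields only small, short-lived avalanches.

        import numpy as np

        rng = np.random.default_rng(1)

        def avalanche_size(sigma, max_size=10_000):
            # Each active unit triggers Poisson(sigma) descendants per step.
            active, size = 1, 1
            while active and size < max_size:
                active = rng.poisson(sigma * active)
                size += active
            return size

        for sigma in (0.8, 1.0):
            sizes = np.array([avalanche_size(sigma) for _ in range(5_000)])
            print(f"sigma={sigma}: mean size {sizes.mean():.1f}, "
                  f"P(size > 100) = {(sizes > 100).mean():.3f}")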