Geoscience after IT: Part F. Familiarization with quantitative analysis
Numbers, measurement and calculation extend our view of the world. Statistical methods describe the properties of sets of quantitative data, and can test models (particularly the model that observed relationships arose by chance) and help us to draw conclusions. Links between spatial and quantitative methods, through coordinate geometry and matrix algebra, lead to graphical representations for visualizing and exploring relationships. Multivariate statistics tie into visualization to look at patterns among many properties.
Conglomerate Multi-Fidelity Gaussian Process Modeling, with Application to Heavy-Ion Collisions
In an era where scientific experimentation is often costly, multi-fidelity emulation provides a powerful tool for predictive scientific computing. While there has been notable work on multi-fidelity modeling, existing models do not incorporate an important "conglomerate" property of multi-fidelity simulators, where the accuracies of different simulator components (modeling separate physics) are controlled by different fidelity parameters. Such conglomerate simulators are widely encountered in complex nuclear physics and astrophysics applications. We thus propose a new CONglomerate multi-FIdelity Gaussian process (CONFIG) model, which embeds this conglomerate structure within a novel non-stationary covariance function. We show that the proposed CONFIG model can capture prior knowledge on the numerical convergence of conglomerate simulators, which allows for cost-efficient emulation of multi-fidelity systems. We demonstrate the improved predictive performance of CONFIG over state-of-the-art models in a suite of numerical experiments and two applications: the first for emulation of cantilever beam deflection and the second for emulating the evolution of the quark-gluon plasma, which is theorized to have filled the Universe shortly after the Big Bang.
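The general idea of embedding a fidelity parameter in a GP covariance can be sketched as follows. The toy simulator and kernel here are assumptions for illustration only, not the paper's actual CONFIG covariance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "simulator": exact physics at t = 0 plus an error term controlled by
# the fidelity parameter t (invented for illustration).
def simulator(x, t):
    return np.sin(2 * np.pi * x) + t * np.cos(5 * x)

# Hypothetical multi-fidelity kernel: a stationary input kernel times a
# positive semi-definite fidelity term that rewards small t, mimicking
# numerical convergence as t -> 0.
def kernel(x1, t1, x2, t2, ls=0.2, rho=1.0):
    k_x = np.exp(-((x1 - x2) ** 2) / (2 * ls**2))
    k_t = (1.0 + rho * t1 * t2) * np.exp(-rho * (t1 + t2))
    return k_x * k_t

# Many cheap low-fidelity runs, a few expensive high-fidelity runs.
x_lo, t_lo = rng.uniform(0, 1, 15), np.full(15, 0.4)
x_hi, t_hi = rng.uniform(0, 1, 4), np.full(4, 0.05)
X, T = np.concatenate([x_lo, x_hi]), np.concatenate([t_lo, t_hi])
y = simulator(X, T)

K = np.array([[kernel(a, s, b, u) for b, u in zip(X, T)] for a, s in zip(X, T)])
K += 1e-8 * np.eye(len(X))          # jitter for numerical stability
alpha = np.linalg.solve(K, y)

# Emulate the exact (t = 0) response on a grid, i.e. predict the limit
# the multi-fidelity runs converge to.
xs = np.linspace(0, 1, 101)
k_star = np.array([[kernel(xg, 0.0, b, u) for b, u in zip(X, T)] for xg in xs])
pred = k_star @ alpha
print("max |prediction at t=0|:", np.abs(pred).max())
```

The key design choice mirrored here is that fidelity enters the covariance itself, so low- and high-fidelity runs jointly inform the prediction at the exact-physics limit.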
Bitcoin: the wrong implementation of the right idea at the right time
This paper is a study into some of the regulatory implications of cryptocurrencies using the CAMPO research framework (Context, Actors, Methods, Practice, Outcomes). We explain in CAMPO format why virtual currencies are of interest, how self-regulation has failed, and what useful lessons can be learned. We are hopeful that the full paper will produce useful and semi-permanent findings on the usefulness of virtual currencies in general, block chains as a means of mining currency, and the profundity of current 'media darling' currency Bitcoin as compared with the development of block chain generator Ethereum.
While virtual currencies can play a role in creating better trading conditions in virtual communities, despite the risks of non-sovereign issuance and therefore only regulation by code (Brown/Marsden 2013), the methodology used poses significant challenges to researching this 'community', if BitCoin can even be said to have created a single community, as opposed to enabling an alternate method of exchange for potentially all virtual community transactions. First, BitCoin users have transparency of ownership but anonymity in many transactions, necessary for libertarians or outright criminals in such illicit markets as #SilkRoad. Studying community dynamics is therefore made much more difficult than in even such pseudonymous or avatar-based communities as Habbo Hotel, World of Warcraft or SecondLife. The ethical implications of studying such communities raise similar problems as those of Tor, Anonymous, Lulzsec and other anonymous hacker communities. Second, the journalistic accounts of BitCoin markets are subject to sensationalism, hype and inaccuracy, even more so than in the earlier hype cycle for SecondLife, exacerbated by the first issue of anonymity. Third, the virtual currency area is subject to slowly emerging regulation by financial authorities and police forces, which appears to be driving much of the early adopter community 'underground'. Thus, the community in 2016 may not bear much resemblance to that in 2012. Fourth, there has been relatively little academic empirical study of the community, or indeed of virtual currencies in general, until relatively recently. Fifth, the dynamism of the virtual currency environment in the face of the deepening mistrust of the financial system after the 2008 crisis is such that any research conclusions must by their nature be provisional and transient.
All these challenges, particularly the final three, also raise the motivation for research: an alternative financial system which is separated from the real-world sovereign and which can use code regulation with limited enforcement from offline policing, both returns the study to the libertarian self-regulated environment of early 1990s MUDs, and offers a tantalising prospect of a tool to evade the perils of 'private profit, socialized risk' which existing large financial institutions created in the 2008-12 disaster. The need for further research into virtual currencies based on blockchain mining, and for their usage by virtual communities, is thus pressing and should motivate researchers to solve the many problems in methodology for exploring such an environment.
Tri-level decision-making with multiple followers: Model, algorithm and case study
© 2015 Elsevier Inc. Tri-level decision-making arises to address compromises among interacting decision entities distributed throughout a three-level hierarchy; these entities are respectively termed the top-level leader, the middle-level follower and the bottom-level follower. This study considers an uncooperative situation where multiple followers at the same (middle or bottom) level make their individual decisions independently but consider the decision results of their counterparts as references through information exchanged among themselves. This situation is called a reference-based uncooperative multi-follower tri-level (MFTL) decision problem, which appears in many real-world applications. To solve this problem, we need to find an optimal solution achieving both the Stackelberg equilibrium in the three-level vertical structure and the Nash equilibrium among multiple followers at the same horizontal level. In this paper, we first propose a general linear MFTL decision model for this situation. We then develop an MFTL Kth-Best algorithm to find an optimal solution to the model. Since the optimal solution represents a compromise in the uncooperative situation, and it is often imprecise or ambiguous for decision entities to identify their related satisfaction with it, we use a fuzzy programming approach to characterize and evaluate the solution obtained. Lastly, a real-world case study on production-inventory planning illustrates the effectiveness of the proposed MFTL decision techniques.
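The two solution concepts combined in the abstract (Stackelberg equilibrium vertically, Nash equilibrium among same-level followers horizontally) can be illustrated on a deliberately tiny discrete game. The payoffs and action sets below are invented; this is not the paper's MFTL Kth-Best algorithm:

```python
actions = [0, 1, 2]  # small discrete action grid for leader and followers

def leader_payoff(l, f1, f2):
    # Leader prefers high actions but is penalized when followers deviate.
    return 3 * l - (l - f1) ** 2 - (l - f2) ** 2

def follower_payoff(own, other, l):
    # Each follower tracks the leader but avoids crowding its counterpart,
    # using the other's choice as a reference (as in the abstract).
    return -(own - l) ** 2 - 0.5 * (own == other)

def nash(l):
    # Horizontal equilibrium: best-response iteration between followers.
    f1 = f2 = 0
    for _ in range(20):
        f1 = max(actions, key=lambda a: follower_payoff(a, f2, l))
        f2 = max(actions, key=lambda a: follower_payoff(a, f1, l))
    return f1, f2

# Vertical equilibrium: the leader anticipates the followers' Nash response.
best = max(actions, key=lambda l: leader_payoff(l, *nash(l)))
f1, f2 = nash(best)
print(f"leader: {best}, followers: ({f1}, {f2})")
```

The leader optimizes over its own action while anticipating the equilibrium the followers will settle into, which is the essence of the compromise the paper's algorithm computes for the continuous linear case.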
Analysis and optimization of material flow inside the system of rotary coolers and intake pipeline via discrete element method modelling
There is hardly any industry that does not use transport, storage, and processing of particulate solids in its production process. In the past, all device designs were based on empirical relationships or the designer's experience. In the field of particulate solids, however, the discrete element method (DEM) has been increasingly used in recent years. This study shows how this simulation tool can be used in practice, more specifically in dealing with operating problems of a rotary cooler which ensures the transport and cooling of the hot fly ash generated by combustion in fluidized bed boilers. For the given operating conditions, an analysis of the current cooling design was carried out, consisting of a non-standard intake pipeline which divides and supplies the material to two rotary coolers. The study revealed shortcomings in both the pipeline design and the cooler design. The material was unevenly dispensed between the two coolers, which, combined with the limited transport capacity of the coolers, led to overflowing and congestion of the whole system. Therefore, after visualizing the material flow and exporting the necessary data using DEM, design measures to mitigate these unwanted phenomena were carried out.
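As a minimal sketch of the contact mechanics underlying DEM codes, consider a single particle hitting a wall under a linear spring-dashpot contact law. All parameters are invented, and this is far simpler than the coupled cooler-and-pipeline simulation in the study:

```python
# 1-D spring-dashpot contact, the basic force model in many DEM codes
# (illustrative parameters, not from the study).
k = 1e4      # contact stiffness [N/m]
c = 5.0      # damping coefficient [N*s/m]
m = 0.01     # particle mass [kg]
r = 0.005    # particle radius [m]

x, v = 0.02, -1.0   # particle approaching a wall at x = 0 [m, m/s]
dt = 1e-5           # time step, small relative to the contact duration

for step in range(20_000):
    overlap = r - x                                   # > 0 while in contact
    f = k * overlap - c * v if overlap > 0 else 0.0   # spring + dashpot force
    v += (f / m) * dt                                 # semi-implicit Euler
    x += v * dt

print(f"rebound velocity: {v:.3f} m/s")
```

The dashpot dissipates energy during contact, so the particle rebounds slower than it arrived; full DEM simulations sum such pairwise contact forces over millions of particles per time step.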
Emergent complex neural dynamics
A large repertoire of spatiotemporal activity patterns in the brain is the basis for adaptive behaviour. Understanding the mechanism by which the brain's hundred billion neurons and hundred trillion synapses manage to produce such a range of cortical configurations in a flexible manner remains a fundamental problem in neuroscience. One plausible solution is the involvement of universal mechanisms of emergent complex phenomena evident in dynamical systems poised near a critical point of a second-order phase transition. We review recent theoretical and empirical results supporting the notion that the brain is naturally poised near criticality, as well as the implications of this view for a better understanding of the brain.
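The notion of being poised near a critical point is often illustrated with a branching process: when each active unit triggers one descendant on average, activity "avalanches" lose their characteristic scale. A toy sketch, with parameters assumed for illustration rather than taken from the review:

```python
import random

random.seed(0)

# Each active unit triggers 0, 1 or 2 descendants, `sigma` on average; an
# avalanche is the total number of activations before the cascade dies out.
def avalanche_size(sigma, cap=10_000):
    active, size = 1, 0
    while active and size < cap:
        size += active
        nxt = 0
        for _ in range(active):
            nxt += (random.random() < sigma / 2) + (random.random() < sigma / 2)
        active = nxt
    return size

sub = [avalanche_size(0.8) for _ in range(1000)]   # subcritical: small events
crit = [avalanche_size(1.0) for _ in range(1000)]  # critical: heavy-tailed sizes

print("mean size, sigma=0.8:", sum(sub) / len(sub))
print("mean size, sigma=1.0:", sum(crit) / len(crit))
```

Just below criticality the mean avalanche stays small and bounded, while at the critical branching ratio rare system-spanning events dominate, the kind of scale-free dynamics the review associates with cortical activity.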
- …